41016 1727204175.78374: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-bGV executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 41016 1727204175.78808: Added group all to inventory 41016 1727204175.78810: Added group ungrouped to inventory 41016 1727204175.78814: Group all now contains ungrouped 41016 1727204175.78817: Examining possible inventory source: /tmp/network-zt6/inventory-rSl.yml 41016 1727204176.00506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 41016 1727204176.00566: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 41016 1727204176.00596: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 41016 1727204176.00654: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 41016 1727204176.00732: Loaded config def from plugin (inventory/script) 41016 1727204176.00735: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 41016 1727204176.00773: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 41016 1727204176.00866: Loaded config def from plugin (inventory/yaml) 41016 1727204176.00868: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 41016 1727204176.00960: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 41016 1727204176.01415: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 41016 1727204176.01419: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 41016 1727204176.01422: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 41016 1727204176.01427: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 41016 1727204176.01432: Loading data from /tmp/network-zt6/inventory-rSl.yml 41016 1727204176.01507: /tmp/network-zt6/inventory-rSl.yml was not parsable by auto 41016 1727204176.01577: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 41016 1727204176.01617: Loading data from /tmp/network-zt6/inventory-rSl.yml 41016 1727204176.01706: group all already in inventory 41016 1727204176.01713: set inventory_file for managed-node1 41016 1727204176.01717: set inventory_dir for managed-node1 41016 1727204176.01718: Added host managed-node1 to inventory 41016 1727204176.01720: Added host managed-node1 to group all 41016 1727204176.01721: set ansible_host for managed-node1 41016 1727204176.01722: set ansible_ssh_extra_args for managed-node1 41016 1727204176.01725: set inventory_file for managed-node2 41016 1727204176.01727: set inventory_dir for managed-node2 41016 1727204176.01728: Added host managed-node2 to inventory 41016 1727204176.01729: Added host managed-node2 to group 
all 41016 1727204176.01730: set ansible_host for managed-node2 41016 1727204176.01731: set ansible_ssh_extra_args for managed-node2 41016 1727204176.01733: set inventory_file for managed-node3 41016 1727204176.01736: set inventory_dir for managed-node3 41016 1727204176.01737: Added host managed-node3 to inventory 41016 1727204176.01739: Added host managed-node3 to group all 41016 1727204176.01741: set ansible_host for managed-node3 41016 1727204176.01741: set ansible_ssh_extra_args for managed-node3 41016 1727204176.01759: Reconcile groups and hosts in inventory. 41016 1727204176.01768: Group ungrouped now contains managed-node1 41016 1727204176.01771: Group ungrouped now contains managed-node2 41016 1727204176.01772: Group ungrouped now contains managed-node3 41016 1727204176.01844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 41016 1727204176.01970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 41016 1727204176.02024: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 41016 1727204176.02052: Loaded config def from plugin (vars/host_group_vars) 41016 1727204176.02054: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 41016 1727204176.02061: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 41016 1727204176.02068: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 41016 1727204176.02118: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 41016 1727204176.02457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204176.02557: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 41016 1727204176.02598: Loaded config def from plugin (connection/local) 41016 1727204176.02602: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 41016 1727204176.03505: Loaded config def from plugin (connection/paramiko_ssh) 41016 1727204176.03680: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 41016 1727204176.06580: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 41016 1727204176.06626: Loaded config def from plugin (connection/psrp) 41016 1727204176.06629: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 41016 1727204176.08373: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 41016 1727204176.08411: Loaded config def from plugin (connection/ssh) 41016 1727204176.08414: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 41016 1727204176.10482: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 41016 1727204176.10528: Loaded config def from plugin (connection/winrm) 41016 1727204176.10531: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 41016 1727204176.10561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 41016 1727204176.10631: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 41016 1727204176.10698: Loaded config def from plugin (shell/cmd) 41016 1727204176.10700: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 41016 1727204176.10732: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 41016 1727204176.10798: Loaded config def from plugin (shell/powershell) 41016 1727204176.10800: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 41016 1727204176.10857: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 41016 1727204176.11034: Loaded config def from plugin (shell/sh) 41016 1727204176.11040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 41016 1727204176.11073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 41016 1727204176.11197: Loaded config def from plugin (become/runas) 41016 1727204176.11200: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 41016 1727204176.11391: Loaded config def from plugin (become/su) 41016 1727204176.11394: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 41016 1727204176.11554: Loaded config def from plugin (become/sudo) 41016 1727204176.11556: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 41016 1727204176.11594: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml 41016 1727204176.11921: in VariableManager get_vars() 41016 1727204176.11942: done with get_vars() 41016 1727204176.12071: trying /usr/local/lib/python3.12/site-packages/ansible/modules 41016 1727204176.15007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 41016 1727204176.15123: in VariableManager get_vars() 41016 1727204176.15128: done with get_vars() 41016 1727204176.15131: variable 'playbook_dir' from source: magic vars 41016 1727204176.15132: variable 'ansible_playbook_python' from source: magic vars 41016 1727204176.15133: variable 'ansible_config_file' from 
source: magic vars 41016 1727204176.15133: variable 'groups' from source: magic vars 41016 1727204176.15134: variable 'omit' from source: magic vars 41016 1727204176.15135: variable 'ansible_version' from source: magic vars 41016 1727204176.15135: variable 'ansible_check_mode' from source: magic vars 41016 1727204176.15136: variable 'ansible_diff_mode' from source: magic vars 41016 1727204176.15137: variable 'ansible_forks' from source: magic vars 41016 1727204176.15137: variable 'ansible_inventory_sources' from source: magic vars 41016 1727204176.15138: variable 'ansible_skip_tags' from source: magic vars 41016 1727204176.15139: variable 'ansible_limit' from source: magic vars 41016 1727204176.15139: variable 'ansible_run_tags' from source: magic vars 41016 1727204176.15140: variable 'ansible_verbosity' from source: magic vars 41016 1727204176.15181: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml 41016 1727204176.16713: in VariableManager get_vars() 41016 1727204176.16762: done with get_vars() 41016 1727204176.16812: in VariableManager get_vars() 41016 1727204176.16826: done with get_vars() 41016 1727204176.16900: in VariableManager get_vars() 41016 1727204176.16916: done with get_vars() 41016 1727204176.17073: in VariableManager get_vars() 41016 1727204176.17088: done with get_vars() 41016 1727204176.17128: in VariableManager get_vars() 41016 1727204176.17142: done with get_vars() 41016 1727204176.17497: in VariableManager get_vars() 41016 1727204176.17511: done with get_vars() 41016 1727204176.17563: in VariableManager get_vars() 41016 1727204176.17578: done with get_vars() 41016 1727204176.17582: variable 'omit' from source: magic vars 41016 1727204176.17601: variable 'omit' from source: magic vars 41016 1727204176.17636: in VariableManager get_vars() 41016 1727204176.17646: done with get_vars() 41016 1727204176.17693: in VariableManager get_vars() 41016 1727204176.17705: done with get_vars() 41016 1727204176.17743: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 41016 1727204176.17967: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 41016 1727204176.18101: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 41016 1727204176.19194: in VariableManager get_vars() 41016 1727204176.19215: done with get_vars() 41016 1727204176.19851: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 41016 1727204176.20077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 41016 1727204176.22765: in VariableManager get_vars() 41016 1727204176.22788: done with get_vars() 41016 1727204176.22793: variable 'omit' from source: magic vars 41016 1727204176.22805: variable 'omit' from source: magic vars 41016 1727204176.22841: in VariableManager get_vars() 41016 1727204176.22856: done with get_vars() 41016 1727204176.22879: in VariableManager get_vars() 41016 1727204176.22902: done with get_vars() 41016 1727204176.22934: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 41016 1727204176.23064: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 41016 1727204176.23184: Loading data from 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 41016 1727204176.25658: in VariableManager get_vars() 41016 1727204176.25683: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 41016 1727204176.27794: in VariableManager get_vars() 41016 1727204176.27818: done with get_vars() 41016 1727204176.27850: in VariableManager get_vars() 41016 1727204176.27866: done with get_vars() 41016 1727204176.28003: in VariableManager get_vars() 41016 1727204176.28022: done with get_vars() 41016 1727204176.28052: in VariableManager get_vars() 41016 1727204176.28066: done with get_vars() 41016 1727204176.28107: in VariableManager get_vars() 41016 1727204176.28130: done with get_vars() 41016 1727204176.28166: in VariableManager get_vars() 41016 1727204176.28211: done with get_vars() 41016 1727204176.28273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 41016 1727204176.28290: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 41016 1727204176.28566: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 41016 1727204176.28818: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 41016 1727204176.28821: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 41016 1727204176.28854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 41016 1727204176.28886: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 41016 1727204176.29087: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 41016 1727204176.29150: Loaded config def from plugin (callback/default) 41016 1727204176.29153: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 41016 1727204176.31868: Loaded config def from plugin (callback/junit) 41016 1727204176.31880: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 41016 1727204176.31936: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 41016 1727204176.32074: Loaded config def from plugin (callback/minimal) 41016 1727204176.32079: Loading CallbackModule 'minimal' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 41016 1727204176.32226: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 41016 1727204176.32334: Loaded config def from plugin (callback/tree) 41016 1727204176.32337: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 41016 1727204176.32469: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 41016 1727204176.32472: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_route_device_nm.yml ******************************************** 2 plays in /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml 41016 1727204176.32503: in VariableManager get_vars() 41016 1727204176.32518: done with get_vars() 41016 1727204176.32525: in VariableManager get_vars() 41016 1727204176.32535: done with get_vars() 41016 1727204176.32538: variable 'omit' from source: magic vars 41016 1727204176.32574: in VariableManager get_vars() 41016 1727204176.32709: done with get_vars() 41016 1727204176.32733: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_route_device.yml' with nm as provider] ***** 41016 1727204176.33330: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 41016 1727204176.33423: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 41016 1727204176.33493: getting the remaining hosts for this loop 41016 1727204176.33495: done getting the remaining hosts for this loop 41016 1727204176.33498: getting the next task for host managed-node1 41016 1727204176.33502: done getting next task for host managed-node1 41016 1727204176.33504: ^ task is: TASK: Gathering Facts 41016 1727204176.33506: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204176.33508: getting variables 41016 1727204176.33509: in VariableManager get_vars() 41016 1727204176.33518: Calling all_inventory to load vars for managed-node1 41016 1727204176.33521: Calling groups_inventory to load vars for managed-node1 41016 1727204176.33523: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204176.33535: Calling all_plugins_play to load vars for managed-node1 41016 1727204176.33546: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204176.33549: Calling groups_plugins_play to load vars for managed-node1 41016 1727204176.33584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204176.33638: done with get_vars() 41016 1727204176.33644: done getting variables 41016 1727204176.33775: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:6 Tuesday 24 September 2024 14:56:16 -0400 (0:00:00.014) 0:00:00.014 ***** 41016 1727204176.33799: entering _queue_task() for managed-node1/gather_facts 41016 1727204176.33801: Creating lock for gather_facts 41016 1727204176.34167: worker is 1 (out of 1 available) 41016 1727204176.34423: exiting _queue_task() for managed-node1/gather_facts 41016 1727204176.34438: done queuing things up, now waiting for results queue to drain 41016 1727204176.34439: waiting for pending results... 
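For orientation, the Gathering Facts trace that follows reduces to a handful of _low_level_execute_command() calls over the multiplexed SSH connection to 10.31.14.47: resolve the remote home directory, create a per-task temp directory, discover a usable Python interpreter, then upload and run AnsiballZ_setup.py. A minimal sketch of that exchange, replayed by hand, is shown below; the remote commands are taken from the trace itself, while the ssh wrapper, the root@10.31.14.47 target, and the fixed temp-directory name are illustrative assumptions (Ansible generates a fresh ansible-tmp-* name for every task).

    # Replaying the fact-gathering exchange by hand (remote commands as issued by
    # _low_level_execute_command() below; user@host and the temp-dir name are assumed).
    ssh root@10.31.14.47 'echo ~ && sleep 0'                      # resolve remote home
    ssh root@10.31.14.47 '( umask 77 && mkdir -p /root/.ansible/tmp/ansible-tmp-demo ) && sleep 0'   # per-task temp dir
    ssh root@10.31.14.47 'echo PLATFORM; uname; echo FOUND; command -v python3.12; command -v python3; echo ENDFOUND && sleep 0'   # interpreter discovery
    # AnsiballZ_setup.py is then copied into the temp dir over sftp, chmod u+x'ed,
    # and executed with the discovered /usr/bin/python3.12 to emit the ansible_facts JSON.
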
41016 1727204176.34484: running TaskExecutor() for managed-node1/TASK: Gathering Facts 41016 1727204176.34704: in run() - task 028d2410-947f-12d5-0ec4-0000000000bf 41016 1727204176.34708: variable 'ansible_search_path' from source: unknown 41016 1727204176.34710: calling self._execute() 41016 1727204176.34742: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204176.34754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204176.34766: variable 'omit' from source: magic vars 41016 1727204176.34891: variable 'omit' from source: magic vars 41016 1727204176.34923: variable 'omit' from source: magic vars 41016 1727204176.34959: variable 'omit' from source: magic vars 41016 1727204176.35012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204176.35052: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204176.35240: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204176.35243: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204176.35245: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204176.35248: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204176.35250: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204176.35256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204176.35368: Set connection var ansible_shell_executable to /bin/sh 41016 1727204176.35484: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204176.35497: Set connection var ansible_shell_type to sh 41016 1727204176.35509: Set connection var ansible_timeout to 10 41016 1727204176.35519: Set connection var ansible_pipelining to False 41016 1727204176.35536: Set connection var ansible_connection to ssh 41016 1727204176.35595: variable 'ansible_shell_executable' from source: unknown 41016 1727204176.35605: variable 'ansible_connection' from source: unknown 41016 1727204176.35612: variable 'ansible_module_compression' from source: unknown 41016 1727204176.35641: variable 'ansible_shell_type' from source: unknown 41016 1727204176.35644: variable 'ansible_shell_executable' from source: unknown 41016 1727204176.35647: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204176.35649: variable 'ansible_pipelining' from source: unknown 41016 1727204176.35651: variable 'ansible_timeout' from source: unknown 41016 1727204176.35653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204176.35877: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204176.36045: variable 'omit' from source: magic vars 41016 1727204176.36048: starting attempt loop 41016 1727204176.36050: running the handler 41016 1727204176.36053: variable 'ansible_facts' from source: unknown 41016 1727204176.36055: _low_level_execute_command(): starting 41016 1727204176.36057: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204176.37098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204176.37116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204176.37192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204176.37235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204176.37252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204176.37273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204176.37692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204176.39350: stdout chunk (state=3): >>>/root <<< 41016 1727204176.39497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204176.39502: stdout chunk (state=3): >>><<< 41016 1727204176.39505: stderr chunk (state=3): >>><<< 41016 1727204176.39890: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204176.39894: _low_level_execute_command(): starting 41016 1727204176.39897: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204176.3959632-41095-140305170878349 `" && echo ansible-tmp-1727204176.3959632-41095-140305170878349="` echo /root/.ansible/tmp/ansible-tmp-1727204176.3959632-41095-140305170878349 `" ) && sleep 0' 41016 
1727204176.41061: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204176.41165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204176.41517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204176.41598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204176.43691: stdout chunk (state=3): >>>ansible-tmp-1727204176.3959632-41095-140305170878349=/root/.ansible/tmp/ansible-tmp-1727204176.3959632-41095-140305170878349 <<< 41016 1727204176.43842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204176.43853: stdout chunk (state=3): >>><<< 41016 1727204176.43982: stderr chunk (state=3): >>><<< 41016 1727204176.43987: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204176.3959632-41095-140305170878349=/root/.ansible/tmp/ansible-tmp-1727204176.3959632-41095-140305170878349 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204176.43989: variable 'ansible_module_compression' from source: unknown 41016 1727204176.44113: ANSIBALLZ: Using generic lock for ansible.legacy.setup 41016 1727204176.44122: ANSIBALLZ: Acquiring lock 41016 1727204176.44129: ANSIBALLZ: Lock acquired: 140580610774160 41016 1727204176.44137: ANSIBALLZ: Creating module 41016 1727204176.80889: ANSIBALLZ: Writing module into payload 41016 1727204176.81122: ANSIBALLZ: Writing module 41016 
1727204176.81147: ANSIBALLZ: Renaming module 41016 1727204176.81159: ANSIBALLZ: Done creating module 41016 1727204176.81207: variable 'ansible_facts' from source: unknown 41016 1727204176.81232: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204176.81319: _low_level_execute_command(): starting 41016 1727204176.81322: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 41016 1727204176.82320: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204176.82379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204176.82394: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204176.82405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204176.82517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204176.84292: stdout chunk (state=3): >>>PLATFORM <<< 41016 1727204176.84395: stdout chunk (state=3): >>>Linux <<< 41016 1727204176.84399: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<< 41016 1727204176.84425: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 41016 1727204176.84598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204176.84601: stdout chunk (state=3): >>><<< 41016 1727204176.84604: stderr chunk (state=3): >>><<< 41016 1727204176.84619: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204176.84682 [managed-node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 41016 1727204176.84698: _low_level_execute_command(): starting 41016 1727204176.84709: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 41016 1727204176.84820: Sending initial data 41016 1727204176.84823: Sent initial data (1181 bytes) 41016 1727204176.85393: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204176.85461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204176.85472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204176.85509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204176.85622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204176.89324: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 41016 1727204176.89886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204176.89890: stdout chunk (state=3): >>><<< 41016 1727204176.89894: stderr chunk (state=3): >>><<< 41016 1727204176.89898: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204176.89933: variable 'ansible_facts' from source: unknown 41016 1727204176.89942: variable 'ansible_facts' from source: unknown 41016 1727204176.89956: variable 'ansible_module_compression' from source: unknown 41016 1727204176.90007: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41016 1727204176.90044: variable 'ansible_facts' from source: unknown 41016 1727204176.90215: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204176.3959632-41095-140305170878349/AnsiballZ_setup.py 41016 1727204176.90362: Sending initial data 41016 1727204176.90365: Sent initial data (154 bytes) 41016 1727204176.91098: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204176.91145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204176.91149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204176.91151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204176.91255: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204176.92988: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 41016 1727204176.93014: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204176.93091: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204176.93190: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmp8e14zcf0 /root/.ansible/tmp/ansible-tmp-1727204176.3959632-41095-140305170878349/AnsiballZ_setup.py <<< 41016 1727204176.93193: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204176.3959632-41095-140305170878349/AnsiballZ_setup.py" <<< 41016 1727204176.93261: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmp8e14zcf0" to remote "/root/.ansible/tmp/ansible-tmp-1727204176.3959632-41095-140305170878349/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204176.3959632-41095-140305170878349/AnsiballZ_setup.py" <<< 41016 1727204176.95005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204176.95008: stderr chunk (state=3): >>><<< 41016 1727204176.95011: stdout chunk (state=3): >>><<< 41016 1727204176.95014: done transferring module to remote 41016 1727204176.95016: _low_level_execute_command(): starting 41016 1727204176.95019: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204176.3959632-41095-140305170878349/ /root/.ansible/tmp/ansible-tmp-1727204176.3959632-41095-140305170878349/AnsiballZ_setup.py && sleep 0' 41016 1727204176.95688: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204176.95801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting 
O_NONBLOCK <<< 41016 1727204176.95828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204176.95941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204176.97956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204176.97960: stdout chunk (state=3): >>><<< 41016 1727204176.97962: stderr chunk (state=3): >>><<< 41016 1727204176.98058: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204176.98061: _low_level_execute_command(): starting 41016 1727204176.98064: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204176.3959632-41095-140305170878349/AnsiballZ_setup.py && sleep 0' 41016 1727204176.98644: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204176.98657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204176.98671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204176.98689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204176.98707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204176.98731: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204176.98790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204176.98846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204176.98863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204176.98885: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 41016 1727204176.98999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204177.01404: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 41016 1727204177.01429: stdout chunk (state=3): >>>import _imp # builtin <<< 41016 1727204177.01461: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 41016 1727204177.01529: stdout chunk (state=3): >>>import '_io' # <<< 41016 1727204177.01551: stdout chunk (state=3): >>>import 'marshal' # <<< 41016 1727204177.01609: stdout chunk (state=3): >>>import 'posix' # <<< 41016 1727204177.01616: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 41016 1727204177.01637: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 41016 1727204177.01699: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204177.01721: stdout chunk (state=3): >>>import '_codecs' # <<< 41016 1727204177.01745: stdout chunk (state=3): >>>import 'codecs' # <<< 41016 1727204177.01756: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 41016 1727204177.01801: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272f684d0><<< 41016 1727204177.01833: stdout chunk (state=3): >>> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272f37b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 41016 1727204177.01848: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272f6aa50> <<< 41016 1727204177.01881: stdout chunk (state=3): >>>import '_signal' # <<< 41016 1727204177.01896: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 41016 1727204177.01932: stdout chunk (state=3): >>>import 'io' # <<< 41016 1727204177.01948: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 41016 1727204177.02038: stdout chunk (state=3): >>>import '_collections_abc' # <<< 41016 1727204177.02066: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 41016 1727204177.02107: stdout chunk (state=3): >>>import 'os' # <<< 41016 1727204177.02114: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 41016 1727204177.02164: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 41016 1727204177.02171: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 41016 1727204177.02185: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 41016 1727204177.02209: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 41016 1727204177.02242: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d1d130> <<< 41016 1727204177.02303: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 41016 1727204177.02306: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204177.02309: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d1e060> <<< 41016 1727204177.02329: stdout chunk (state=3): >>>import 'site' # <<< 41016 1727204177.02370: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 41016 1727204177.02754: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 41016 1727204177.02793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 41016 1727204177.02796: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 41016 1727204177.02828: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 41016 1727204177.02872: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 41016 1727204177.02890: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 41016 1727204177.02934: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 41016 1727204177.02958: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d5bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 41016 1727204177.02995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 41016 1727204177.03001: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d5bf80> <<< 41016 1727204177.03021: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 41016 1727204177.03061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 41016 1727204177.03074: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 41016 1727204177.03128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204177.03164: stdout chunk (state=3): >>>import 'itertools' # <<< 41016 1727204177.03184: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches 
/usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d93830> <<< 41016 1727204177.03223: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d93ec0> <<< 41016 1727204177.03226: stdout chunk (state=3): >>>import '_collections' # <<< 41016 1727204177.03283: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d73b60> import '_functools' # <<< 41016 1727204177.03314: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d712b0> <<< 41016 1727204177.03403: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d59070> <<< 41016 1727204177.03438: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 41016 1727204177.03460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 41016 1727204177.03486: stdout chunk (state=3): >>>import '_sre' # <<< 41016 1727204177.03518: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 41016 1727204177.03540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 41016 1727204177.03553: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 41016 1727204177.03580: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272db77d0> <<< 41016 1727204177.03606: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272db63f0> <<< 41016 1727204177.03631: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d72150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272db4b30> <<< 41016 1727204177.03687: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 41016 1727204177.03706: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272de8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d582f0> <<< 41016 1727204177.03759: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 41016 1727204177.03762: stdout chunk (state=3): >>># extension module '_struct' loaded from 
'/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204177.03773: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272de8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272de8bf0> <<< 41016 1727204177.03802: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272de8fe0> <<< 41016 1727204177.03841: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d56e10> <<< 41016 1727204177.03870: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204177.03873: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 41016 1727204177.03928: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 41016 1727204177.03934: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272de9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272de9370> <<< 41016 1727204177.03978: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 41016 1727204177.03982: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 41016 1727204177.04007: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272dea540> import 'importlib.util' # import 'runpy' # <<< 41016 1727204177.04027: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 41016 1727204177.04098: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 41016 1727204177.04118: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272e04740> import 'errno' # <<< 41016 1727204177.04160: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204177.04186: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272e05e80> <<< 41016 1727204177.04226: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from 
'/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 41016 1727204177.04237: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272e06d20> <<< 41016 1727204177.04287: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272e07350> <<< 41016 1727204177.04317: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272e06270> <<< 41016 1727204177.04334: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 41016 1727204177.04356: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204177.04382: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272e07dd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272e07500> <<< 41016 1727204177.04452: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272dea4b0> <<< 41016 1727204177.04455: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 41016 1727204177.04486: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 41016 1727204177.04498: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 41016 1727204177.04526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc'<<< 41016 1727204177.04557: stdout chunk (state=3): >>> <<< 41016 1727204177.04586: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272affd40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 41016 1727204177.04630: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204177.04633: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272b28860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272b285c0> <<< 41016 1727204177.04653: stdout chunk 
(state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272b28770> <<< 41016 1727204177.04694: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 41016 1727204177.04761: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204177.04902: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272b29100> <<< 41016 1727204177.05050: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272b29a90> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272b289b0> <<< 41016 1727204177.05069: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272afdee0> <<< 41016 1727204177.05091: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 41016 1727204177.05133: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 41016 1727204177.05161: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272b2aea0> <<< 41016 1727204177.05182: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272b29be0> <<< 41016 1727204177.05204: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272deac60> <<< 41016 1727204177.05228: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 41016 1727204177.05301: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204177.05321: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 41016 1727204177.05364: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 41016 1727204177.05387: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272b53200> <<< 41016 1727204177.05449: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 41016 1727204177.05495: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204177.05498: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 41016 1727204177.05518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 41016 1727204177.05553: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272b7b590> <<< 41016 1727204177.05564: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 41016 1727204177.05622: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 41016 1727204177.05672: stdout chunk (state=3): >>>import 'ntpath' # <<< 41016 1727204177.05711: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272bd82f0> <<< 41016 1727204177.05751: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 41016 1727204177.05754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 41016 1727204177.05780: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 41016 1727204177.05824: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 41016 1727204177.05920: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272bdaa50> <<< 41016 1727204177.05994: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272bd8410> <<< 41016 1727204177.06061: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272ba1340> <<< 41016 1727204177.06092: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272511340> <<< 41016 1727204177.06105: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272b7a390> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272b2bdd0> <<< 41016 1727204177.06297: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 41016 1727204177.06313: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f0272b7a990> <<< 41016 1727204177.06595: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_omou28p7/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 41016 1727204177.06741: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.06754: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 41016 1727204177.06803: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 41016 1727204177.06883: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 41016 1727204177.06933: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02725770b0> <<< 41016 1727204177.06936: stdout chunk (state=3): >>>import '_typing' # <<< 41016 1727204177.07128: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272555fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272555130> <<< 41016 1727204177.07132: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.07169: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 41016 1727204177.07219: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41016 1727204177.07222: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 41016 1727204177.07238: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.08711: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.09944: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272574f80> <<< 41016 1727204177.09974: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 41016 1727204177.09997: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 41016 1727204177.10045: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 41016 1727204177.10074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 41016 1727204177.10089: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f02725aaa20> <<< 41016 1727204177.10119: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02725aa7b0> <<< 41016 1727204177.10166: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02725aa0c0> <<< 41016 1727204177.10169: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 41016 1727204177.10191: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 41016 1727204177.10230: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02725aa510> <<< 41016 1727204177.10246: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272577d40> import 'atexit' # <<< 41016 1727204177.10296: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f02725ab7a0> <<< 41016 1727204177.10300: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f02725ab980> <<< 41016 1727204177.10321: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 41016 1727204177.10388: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 41016 1727204177.10398: stdout chunk (state=3): >>>import '_locale' # <<< 41016 1727204177.10483: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02725abec0> <<< 41016 1727204177.10534: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 41016 1727204177.10591: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272415c70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272417860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 41016 1727204177.10672: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 41016 1727204177.10833: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027241c260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 41016 1727204177.10836: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027241d400> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from 
'/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 41016 1727204177.10861: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027241fef0> <<< 41016 1727204177.10904: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272420230> <<< 41016 1727204177.10953: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027241e1b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 41016 1727204177.11042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 41016 1727204177.11063: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 41016 1727204177.11205: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 41016 1727204177.11226: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272423e30> import '_tokenize' # <<< 41016 1727204177.11307: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272422900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272422660> <<< 41016 1727204177.11330: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 41016 1727204177.11411: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272422bd0> <<< 41016 1727204177.11516: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027241e6c0> <<< 41016 1727204177.11539: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272467ef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272468200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 41016 1727204177.11569: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 41016 1727204177.11747: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272469c70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272469a30> <<< 41016 1727204177.11750: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f027246c1a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027246a330> <<< 41016 1727204177.11762: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 41016 1727204177.11806: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204177.11847: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 41016 1727204177.11893: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027246f980> <<< 41016 1727204177.12028: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027246c350> <<< 41016 1727204177.12088: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272470770> <<< 41016 1727204177.12171: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f02724707a0> <<< 41016 1727204177.12211: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272470ad0> import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f02724682f0> <<< 41016 1727204177.12283: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 41016 1727204177.12394: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204177.12424: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272300320> <<< 41016 1727204177.12472: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f02723013d0> <<< 41016 1727204177.12494: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272472ab0> <<< 41016 1727204177.12562: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204177.12588: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272473e60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272472720> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 41016 1727204177.12685: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.12770: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41016 1727204177.12813: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # # zipimport: zlib available <<< 41016 1727204177.12860: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 41016 1727204177.12863: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.12964: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.13083: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.13651: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.14237: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 41016 1727204177.14255: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 41016 1727204177.14288: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 41016 
1727204177.14355: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 41016 1727204177.14366: stdout chunk (state=3): >>> import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f02723055b0> <<< 41016 1727204177.14440: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 41016 1727204177.14467: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272306360> <<< 41016 1727204177.14484: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272301fd0> <<< 41016 1727204177.14656: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 41016 1727204177.15207: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272306330> # zipimport: zlib available <<< 41016 1727204177.15433: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.15912: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.15992: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.16065: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 41016 1727204177.16092: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.16118: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.16144: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 41016 1727204177.16199: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.16231: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.16337: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 41016 1727204177.16362: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 41016 1727204177.16418: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.16449: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 41016 1727204177.16465: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.16704: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.16953: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 41016 1727204177.17016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 41016 1727204177.17065: stdout chunk (state=3): >>>import '_ast' # <<< 41016 1727204177.17188: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272307590> # zipimport: zlib available # zipimport: zlib available <<< 41016 
1727204177.17266: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 41016 1727204177.17411: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 41016 1727204177.17451: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.17496: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.17630: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 41016 1727204177.17681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204177.17784: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272311fd0> <<< 41016 1727204177.17819: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027230cf80> <<< 41016 1727204177.17950: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 41016 1727204177.17971: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.18000: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.18058: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.18092: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204177.18180: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 41016 1727204177.18272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 41016 1727204177.18279: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 41016 1727204177.18289: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 41016 1727204177.18352: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02723f6a80> <<< 41016 1727204177.18369: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02724e6750> <<< 41016 1727204177.18500: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272312210> import 
'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272311e20> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 41016 1727204177.18512: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.18644: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 41016 1727204177.18648: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 41016 1727204177.18650: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.18698: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.18845: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 41016 1727204177.18851: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.18906: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.18950: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.18972: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 41016 1727204177.19058: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.19134: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.19205: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.typing' # <<< 41016 1727204177.19208: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.19399: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.19625: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.19668: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.19736: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204177.19747: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 41016 1727204177.20013: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02723a60c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f0271f6ffe0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0271f74410> <<< 41016 1727204177.20054: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027238ea50> <<< 41016 1727204177.20081: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02723a6c60> <<< 41016 1727204177.20104: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02723a47a0> <<< 41016 1727204177.20137: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02723a43e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 41016 1727204177.20235: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 41016 1727204177.20283: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 41016 1727204177.20286: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 41016 1727204177.20321: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0271f77230> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271f76ae0> <<< 41016 1727204177.20372: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0271f76cc0> <<< 41016 1727204177.20379: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271f75f10> <<< 41016 1727204177.20402: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 41016 1727204177.20539: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 41016 1727204177.20580: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271f77410> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 41016 1727204177.20614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 41016 1727204177.20637: stdout chunk (state=3): >>># extension module 
'_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204177.20662: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0271fd9ee0> <<< 41016 1727204177.20702: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271f77ef0> <<< 41016 1727204177.20800: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02723a6120> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 41016 1727204177.20814: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 41016 1727204177.20846: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.20923: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 41016 1727204177.20927: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.20977: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.21059: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 41016 1727204177.21138: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.21141: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 41016 1727204177.21156: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 41016 1727204177.21201: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.21257: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 41016 1727204177.21333: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.21355: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 41016 1727204177.21503: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41016 1727204177.21529: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.21600: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 41016 1727204177.21692: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 41016 1727204177.22106: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.22571: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 41016 1727204177.22626: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.22688: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.22717: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.22796: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 41016 1727204177.22800: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.22932: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available <<< 41016 1727204177.22955: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 41016 1727204177.22977: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.23006: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.23043: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 41016 1727204177.23081: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.23109: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 41016 1727204177.23138: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.23252: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.23297: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 41016 1727204177.23351: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271fdbdd0> <<< 41016 1727204177.23385: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 41016 1727204177.23462: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 41016 1727204177.23577: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271fda8a0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 41016 1727204177.23604: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.23879: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 41016 1727204177.23915: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.23951: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 41016 1727204177.24003: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 41016 1727204177.24030: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.24053: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.24104: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 41016 1727204177.24162: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 41016 1727204177.24687: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f02720120c0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272002d50> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 41016 1727204177.24712: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 41016 1727204177.24824: stdout chunk (state=3): >>># zipimport: zlib 
available # zipimport: zlib available <<< 41016 1727204177.25032: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.25094: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 41016 1727204177.25097: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.25138: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.25189: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 41016 1727204177.25192: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.25231: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.25271: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 41016 1727204177.25293: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 41016 1727204177.25313: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204177.25337: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0271d55e80> <<< 41016 1727204177.25370: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271d57830> import 'ansible.module_utils.facts.system.user' # <<< 41016 1727204177.25490: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 41016 1727204177.25658: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.25819: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 41016 1727204177.25831: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.25924: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.26032: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.26065: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.26108: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 41016 1727204177.26142: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.26158: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.26178: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.26322: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.26482: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 41016 1727204177.26494: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.26613: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.26732: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 41016 1727204177.26756: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.26786: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 41016 1727204177.26819: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.27409: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.27978: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 41016 1727204177.27991: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.28101: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.28216: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 41016 1727204177.28231: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.28325: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.28435: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 41016 1727204177.28446: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.28598: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.28783: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 41016 1727204177.28789: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.28804: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 41016 1727204177.28859: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.28963: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 41016 1727204177.28966: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.29010: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.29115: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.29336: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.29540: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 41016 1727204177.29561: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 41016 1727204177.29656: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 41016 1727204177.29730: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.29742: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 41016 1727204177.29767: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.29906: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 41016 1727204177.29923: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 41016 1727204177.30070: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.30097: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 41016 1727204177.30132: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.30162: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 41016 1727204177.30190: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.30455: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.30852: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 41016 1727204177.30856: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.30879: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 41016 1727204177.30906: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.30945: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 41016 1727204177.30991: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.30994: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.31086: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 41016 1727204177.31089: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.31125: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 41016 1727204177.31310: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.31335: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 41016 1727204177.31382: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.31428: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 41016 1727204177.31454: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.31534: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41016 1727204177.31573: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.31680: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.31749: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 41016 1727204177.32062: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 41016 1727204177.32075: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.32244: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 41016 1727204177.32257: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.32315: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.32355: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 41016 1727204177.32366: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.32412: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.32452: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 41016 1727204177.32472: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.32549: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.32633: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 41016 1727204177.32659: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.32736: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 41016 1727204177.32837: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 41016 1727204177.32952: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204177.33404: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0271d827e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271d82cf0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271d807a0> <<< 41016 1727204177.48180: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271dc9250> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271dc9fd0> <<< 41016 1727204177.48499: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271e146e0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271e14230> <<< 41016 1727204177.48607: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 41016 1727204177.69183: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_loadavg": {"1m": 0.640625, "5m": 0.5478515625, "15m": 0.30126953125}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": 
"10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQ<<< 41016 1727204177.69238: stdout chunk (state=3): >>>UESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "17", "epoch": "1727204177", "epoch_int": "1727204177", "date": "2024-09-24", "time": "14:56:17", "iso8601_micro": "2024-09-24T18:56:17.342623Z", "iso8601": "2024-09-24T18:56:17Z", "iso8601_basic": "20240924T145617342623", "iso8601_basic_short": "20240924T145617", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", 
"hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, 
"ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2925, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 606, "free": 2925}, "nocache": {"free": 3284, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuid<<< 41016 1727204177.69265: stdout chunk (state=3): >>>s": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 768, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785350144, "block_size": 4096, "block_total": 65519099, "block_available": 63912439, "block_used": 1606660, "inode_total": 131070960, "inode_available": 131027259, "inode_used": 43701, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41016 1727204177.70132: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ <<< 41016 1727204177.70287: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear 
sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing 
ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # 
destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file <<< 41016 1727204177.70343: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # 
cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy 
ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 41016 1727204177.70922: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 41016 1727204177.71012: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 41016 1727204177.71050: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath <<< 41016 1727204177.71118: stdout chunk (state=3): >>># destroy importlib # 
destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 41016 1727204177.71208: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 41016 1727204177.71238: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 41016 1727204177.71299: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 41016 1727204177.71303: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 41016 1727204177.71540: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 41016 1727204177.71544: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep<<< 41016 1727204177.71584: stdout chunk (state=3): >>> # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 41016 1727204177.71679: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 41016 1727204177.71683: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 41016 1727204177.71686: stdout chunk (state=3): >>># 
cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 41016 1727204177.71726: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io <<< 41016 1727204177.71749: stdout chunk (state=3): >>># destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 41016 1727204177.71957: stdout chunk (state=3): >>># destroy sys.monitoring <<< 41016 1727204177.71981: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 41016 1727204177.72066: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 41016 1727204177.72152: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 41016 1727204177.72156: stdout chunk (state=3): >>># destroy _frozen_importlib_external <<< 41016 1727204177.72209: stdout chunk (state=3): >>># destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 41016 1727204177.72263: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 41016 1727204177.72395: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 41016 1727204177.72896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204177.72906: stdout chunk (state=3): >>><<< 41016 1727204177.72920: stderr chunk (state=3): >>><<< 41016 1727204177.73240: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272f684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272f37b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272f6aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d1d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d1e060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d5bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d5bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d93830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d93ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d73b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d712b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d59070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272db77d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272db63f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d72150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272db4b30> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272de8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d582f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272de8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272de8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272de8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272d56e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272de9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272de9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272dea540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272e04740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272e05e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f0272e06d20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272e07350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272e06270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272e07dd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272e07500> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272dea4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272affd40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272b28860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272b285c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272b28770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272b29100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272b29a90> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f0272b289b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272afdee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272b2aea0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272b29be0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272deac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272b53200> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272b7b590> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272bd82f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272bdaa50> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272bd8410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272ba1340> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272511340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272b7a390> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272b2bdd0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f0272b7a990> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_omou28p7/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02725770b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272555fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272555130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272574f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f02725aaa20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02725aa7b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02725aa0c0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02725aa510> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272577d40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f02725ab7a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f02725ab980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02725abec0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272415c70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272417860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027241c260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027241d400> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027241fef0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272420230> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027241e1b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272423e30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272422900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f0272422660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272422bd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027241e6c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272467ef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272468200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272469c70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272469a30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f027246c1a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027246a330> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027246f980> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027246c350> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272470770> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f02724707a0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272470ad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02724682f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272300320> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f02723013d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272472ab0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272473e60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272472720> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f02723055b0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272306360> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272301fd0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272306330> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272307590> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0272311fd0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027230cf80> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02723f6a80> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02724e6750> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272312210> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272311e20> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02723a60c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271f6ffe0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f0271f74410> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f027238ea50> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02723a6c60> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02723a47a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02723a43e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0271f77230> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271f76ae0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0271f76cc0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271f75f10> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271f77410> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0271fd9ee0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271f77ef0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f02723a6120> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271fdbdd0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271fda8a0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f02720120c0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0272002d50> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0271d55e80> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271d57830> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f0271d827e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271d82cf0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271d807a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271dc9250> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271dc9fd0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271e146e0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f0271e14230> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_loadavg": {"1m": 0.640625, "5m": 0.5478515625, "15m": 0.30126953125}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, 
"rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "17", "epoch": "1727204177", "epoch_int": "1727204177", "date": "2024-09-24", "time": "14:56:17", "iso8601_micro": "2024-09-24T18:56:17.342623Z", "iso8601": "2024-09-24T18:56:17Z", "iso8601_basic": "20240924T145617342623", "iso8601_basic_short": "20240924T145617", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off 
[fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2925, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 606, "free": 2925}, "nocache": {"free": 3284, "used": 247}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 768, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785350144, "block_size": 4096, "block_total": 65519099, "block_available": 63912439, "block_used": 1606660, "inode_total": 131070960, "inode_available": 131027259, "inode_used": 43701, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": 
["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # 
cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # 
cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # 
destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. [WARNING]: Module invocation had junk after the JSON data:
[the warning body repeats the same interpreter shutdown trace verbatim; elided]
[WARNING]: Platform linux on host managed-node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
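Neither warning above indicates a failure: the facts JSON was parsed before the trailing junk (which looks like Python "-v"/PYTHONVERBOSE-style shutdown tracing from the remote interpreter), and the second warning only reports which Python interpreter discovery chose. If the discovery warning is unwanted, the interpreter can be pinned per host. A minimal inventory sketch, assuming the managed-node1 host and the /usr/bin/python3.12 path reported above (the real inventory-rSl.yml may differ):

    all:
      hosts:
        managed-node1:
          ansible_host: 10.31.14.47
          # Hypothetical: pin the interpreter that discovery found so the
          # interpreter_discovery warning is not emitted on later runs.
          ansible_python_interpreter: /usr/bin/python3.12

With ansible_python_interpreter set to an explicit path, ansible-core skips interpreter discovery for that host and uses the given path as-is.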
41016 1727204177.75606: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204176.3959632-41095-140305170878349/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204177.75638: _low_level_execute_command(): starting 41016 1727204177.75648: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204176.3959632-41095-140305170878349/ > /dev/null 2>&1 && sleep 0' 41016 1727204177.76399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204177.76425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204177.76446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204177.76468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204177.76711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41016 1727204177.80083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204177.80087: stderr chunk (state=3): >>><<< 41016 1727204177.80090: stdout chunk (state=3): >>><<< 41016 1727204177.80092: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41016 1727204177.80095: handler run complete 41016 1727204177.80097: variable 'ansible_facts' from source: unknown 41016 1727204177.80228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204177.80901: variable 'ansible_facts' from source: unknown 41016 1727204177.81052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204177.81411: attempt loop complete, returning result 41016 1727204177.81420: _execute() done 41016 1727204177.81426: dumping result to json 41016 1727204177.81459: done dumping result, returning 41016 1727204177.81471: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [028d2410-947f-12d5-0ec4-0000000000bf] 41016 1727204177.81481: sending task result for task 028d2410-947f-12d5-0ec4-0000000000bf 41016 1727204177.82809: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000bf 41016 1727204177.82815: WORKER PROCESS EXITING ok: [managed-node1] 41016 1727204177.83336: no more pending results, returning what we have 41016 1727204177.83339: results queue empty 41016 1727204177.83340: checking for any_errors_fatal 41016 1727204177.83341: done checking for any_errors_fatal 41016 1727204177.83342: checking for max_fail_percentage 41016 1727204177.83344: done checking for max_fail_percentage 41016 1727204177.83344: checking to see if all hosts have failed and the running result is not ok 41016 1727204177.83345: done checking to see if all hosts have failed 41016 1727204177.83346: getting the remaining hosts for this loop 41016 1727204177.83347: done getting the remaining hosts for this loop 41016 1727204177.83351: getting the next task for host managed-node1 41016 1727204177.83357: done getting next task for host managed-node1 41016 1727204177.83359: ^ task is: TASK: meta (flush_handlers) 41016 1727204177.83361: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204177.83365: getting variables 41016 1727204177.83366: in VariableManager get_vars() 41016 1727204177.83392: Calling all_inventory to load vars for managed-node1 41016 1727204177.83395: Calling groups_inventory to load vars for managed-node1 41016 1727204177.83398: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204177.83407: Calling all_plugins_play to load vars for managed-node1 41016 1727204177.83409: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204177.83412: Calling groups_plugins_play to load vars for managed-node1 41016 1727204177.83808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204177.84183: done with get_vars() 41016 1727204177.84194: done getting variables 41016 1727204177.84373: in VariableManager get_vars() 41016 1727204177.84384: Calling all_inventory to load vars for managed-node1 41016 1727204177.84386: Calling groups_inventory to load vars for managed-node1 41016 1727204177.84388: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204177.84393: Calling all_plugins_play to load vars for managed-node1 41016 1727204177.84395: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204177.84398: Calling groups_plugins_play to load vars for managed-node1 41016 1727204177.84652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204177.85045: done with get_vars() 41016 1727204177.85059: done queuing things up, now waiting for results queue to drain 41016 1727204177.85062: results queue empty 41016 1727204177.85062: checking for any_errors_fatal 41016 1727204177.85065: done checking for any_errors_fatal 41016 1727204177.85066: checking for max_fail_percentage 41016 1727204177.85067: done checking for max_fail_percentage 41016 1727204177.85068: checking to see if all hosts have failed and the running result is not ok 41016 1727204177.85068: done checking to see if all hosts have failed 41016 1727204177.85074: getting the remaining hosts for this loop 41016 1727204177.85077: done getting the remaining hosts for this loop 41016 1727204177.85080: getting the next task for host managed-node1 41016 1727204177.85085: done getting next task for host managed-node1 41016 1727204177.85088: ^ task is: TASK: Include the task 'el_repo_setup.yml' 41016 1727204177.85089: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204177.85091: getting variables 41016 1727204177.85092: in VariableManager get_vars() 41016 1727204177.85100: Calling all_inventory to load vars for managed-node1 41016 1727204177.85103: Calling groups_inventory to load vars for managed-node1 41016 1727204177.85106: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204177.85110: Calling all_plugins_play to load vars for managed-node1 41016 1727204177.85112: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204177.85115: Calling groups_plugins_play to load vars for managed-node1 41016 1727204177.85435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204177.85843: done with get_vars() 41016 1727204177.85852: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:11 Tuesday 24 September 2024 14:56:17 -0400 (0:00:01.522) 0:00:01.536 ***** 41016 1727204177.86052: entering _queue_task() for managed-node1/include_tasks 41016 1727204177.86054: Creating lock for include_tasks 41016 1727204177.86823: worker is 1 (out of 1 available) 41016 1727204177.86834: exiting _queue_task() for managed-node1/include_tasks 41016 1727204177.86846: done queuing things up, now waiting for results queue to drain 41016 1727204177.86847: waiting for pending results... 41016 1727204177.87196: running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' 41016 1727204177.87480: in run() - task 028d2410-947f-12d5-0ec4-000000000006 41016 1727204177.87527: variable 'ansible_search_path' from source: unknown 41016 1727204177.87569: calling self._execute() 41016 1727204177.87672: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204177.87883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204177.87886: variable 'omit' from source: magic vars 41016 1727204177.87973: _execute() done 41016 1727204177.88040: dumping result to json 41016 1727204177.88065: done dumping result, returning 41016 1727204177.88077: done running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' [028d2410-947f-12d5-0ec4-000000000006] 41016 1727204177.88114: sending task result for task 028d2410-947f-12d5-0ec4-000000000006 41016 1727204177.88300: no more pending results, returning what we have 41016 1727204177.88307: in VariableManager get_vars() 41016 1727204177.88347: Calling all_inventory to load vars for managed-node1 41016 1727204177.88350: Calling groups_inventory to load vars for managed-node1 41016 1727204177.88354: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204177.88368: Calling all_plugins_play to load vars for managed-node1 41016 1727204177.88372: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204177.88377: Calling groups_plugins_play to load vars for managed-node1 41016 1727204177.88649: done sending task result for task 028d2410-947f-12d5-0ec4-000000000006 41016 1727204177.88653: WORKER PROCESS EXITING 41016 1727204177.88677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204177.89110: done with get_vars() 41016 1727204177.89118: variable 'ansible_search_path' from source: unknown 41016 1727204177.89132: we have included files to process 41016 1727204177.89133: 
generating all_blocks data 41016 1727204177.89135: done generating all_blocks data 41016 1727204177.89136: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 41016 1727204177.89137: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 41016 1727204177.89140: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 41016 1727204177.90055: in VariableManager get_vars() 41016 1727204177.90079: done with get_vars() 41016 1727204177.90092: done processing included file 41016 1727204177.90094: iterating over new_blocks loaded from include file 41016 1727204177.90096: in VariableManager get_vars() 41016 1727204177.90106: done with get_vars() 41016 1727204177.90108: filtering new block on tags 41016 1727204177.90122: done filtering new block on tags 41016 1727204177.90125: in VariableManager get_vars() 41016 1727204177.90135: done with get_vars() 41016 1727204177.90137: filtering new block on tags 41016 1727204177.90152: done filtering new block on tags 41016 1727204177.90155: in VariableManager get_vars() 41016 1727204177.90164: done with get_vars() 41016 1727204177.90165: filtering new block on tags 41016 1727204177.90183: done filtering new block on tags 41016 1727204177.90185: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node1 41016 1727204177.90191: extending task lists for all hosts with included blocks 41016 1727204177.90240: done extending task lists 41016 1727204177.90241: done processing included files 41016 1727204177.90242: results queue empty 41016 1727204177.90243: checking for any_errors_fatal 41016 1727204177.90244: done checking for any_errors_fatal 41016 1727204177.90245: checking for max_fail_percentage 41016 1727204177.90246: done checking for max_fail_percentage 41016 1727204177.90247: checking to see if all hosts have failed and the running result is not ok 41016 1727204177.90248: done checking to see if all hosts have failed 41016 1727204177.90248: getting the remaining hosts for this loop 41016 1727204177.90249: done getting the remaining hosts for this loop 41016 1727204177.90252: getting the next task for host managed-node1 41016 1727204177.90256: done getting next task for host managed-node1 41016 1727204177.90258: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 41016 1727204177.90260: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204177.90262: getting variables 41016 1727204177.90263: in VariableManager get_vars() 41016 1727204177.90271: Calling all_inventory to load vars for managed-node1 41016 1727204177.90274: Calling groups_inventory to load vars for managed-node1 41016 1727204177.90278: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204177.90287: Calling all_plugins_play to load vars for managed-node1 41016 1727204177.90291: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204177.90294: Calling groups_plugins_play to load vars for managed-node1 41016 1727204177.90458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204177.90643: done with get_vars() 41016 1727204177.90652: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:56:17 -0400 (0:00:00.046) 0:00:01.583 ***** 41016 1727204177.90726: entering _queue_task() for managed-node1/setup 41016 1727204177.91158: worker is 1 (out of 1 available) 41016 1727204177.91167: exiting _queue_task() for managed-node1/setup 41016 1727204177.91180: done queuing things up, now waiting for results queue to drain 41016 1727204177.91181: waiting for pending results... 41016 1727204177.91333: running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 41016 1727204177.91457: in run() - task 028d2410-947f-12d5-0ec4-0000000000d0 41016 1727204177.91474: variable 'ansible_search_path' from source: unknown 41016 1727204177.91483: variable 'ansible_search_path' from source: unknown 41016 1727204177.91531: calling self._execute() 41016 1727204177.91627: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204177.91631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204177.91634: variable 'omit' from source: magic vars 41016 1727204177.92833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204177.95182: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204177.95186: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204177.95208: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204177.95256: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204177.95293: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204177.95432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204177.95465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204177.95508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 41016 1727204177.95556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204177.95573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204177.95750: variable 'ansible_facts' from source: unknown 41016 1727204177.95821: variable 'network_test_required_facts' from source: task vars 41016 1727204177.95866: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 41016 1727204177.95879: variable 'omit' from source: magic vars 41016 1727204177.95919: variable 'omit' from source: magic vars 41016 1727204177.95956: variable 'omit' from source: magic vars 41016 1727204177.96084: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204177.96087: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204177.96089: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204177.96092: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204177.96094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204177.96099: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204177.96107: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204177.96115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204177.96215: Set connection var ansible_shell_executable to /bin/sh 41016 1727204177.96227: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204177.96237: Set connection var ansible_shell_type to sh 41016 1727204177.96290: Set connection var ansible_timeout to 10 41016 1727204177.96306: Set connection var ansible_pipelining to False 41016 1727204177.96318: Set connection var ansible_connection to ssh 41016 1727204177.96343: variable 'ansible_shell_executable' from source: unknown 41016 1727204177.96482: variable 'ansible_connection' from source: unknown 41016 1727204177.96485: variable 'ansible_module_compression' from source: unknown 41016 1727204177.96487: variable 'ansible_shell_type' from source: unknown 41016 1727204177.96489: variable 'ansible_shell_executable' from source: unknown 41016 1727204177.96491: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204177.96493: variable 'ansible_pipelining' from source: unknown 41016 1727204177.96495: variable 'ansible_timeout' from source: unknown 41016 1727204177.96497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204177.96623: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204177.96640: variable 'omit' from source: magic vars 41016 1727204177.96650: starting attempt loop 41016 
1727204177.96659: running the handler 41016 1727204177.96680: _low_level_execute_command(): starting 41016 1727204177.96693: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204177.97705: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204177.97720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204177.97735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204177.97796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204177.97855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204177.97930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204177.98155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204178.00590: stdout chunk (state=3): >>>/root <<< 41016 1727204178.00632: stdout chunk (state=3): >>><<< 41016 1727204178.00738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204178.00853: stderr chunk (state=3): >>><<< 41016 1727204178.00856: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204178.00891: _low_level_execute_command(): starting 41016 1727204178.00901: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204178.0087364-41254-37210591853785 `" && echo 
ansible-tmp-1727204178.0087364-41254-37210591853785="` echo /root/.ansible/tmp/ansible-tmp-1727204178.0087364-41254-37210591853785 `" ) && sleep 0' 41016 1727204178.02229: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204178.02259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204178.02277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204178.02298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204178.02363: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204178.02657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204178.02729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204178.05615: stdout chunk (state=3): >>>ansible-tmp-1727204178.0087364-41254-37210591853785=/root/.ansible/tmp/ansible-tmp-1727204178.0087364-41254-37210591853785 <<< 41016 1727204178.05841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204178.05844: stdout chunk (state=3): >>><<< 41016 1727204178.05847: stderr chunk (state=3): >>><<< 41016 1727204178.05862: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204178.0087364-41254-37210591853785=/root/.ansible/tmp/ansible-tmp-1727204178.0087364-41254-37210591853785 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204178.06038: variable 'ansible_module_compression' from source: unknown 41016 1727204178.06095: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41016 1727204178.06320: variable 'ansible_facts' from source: unknown 41016 1727204178.06796: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204178.0087364-41254-37210591853785/AnsiballZ_setup.py 41016 1727204178.07043: Sending initial data 41016 1727204178.07046: Sent initial data (153 bytes) 41016 1727204178.08161: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204178.08292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204178.08415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204178.08486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204178.08606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204178.10377: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204178.10454: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
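Editorial aside: the module transfer above rides on the OpenSSH ControlMaster session the log keeps reporting ("auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566'"), so each _low_level_execute_command() and the sftp upload skip a fresh TCP connect and re-authentication. As a rough illustration only (the host, socket path and options below are invented, and this is not Ansible's ssh connection plugin), reusing a persistent control socket from Python might look like:

import subprocess

# Hypothetical control socket and host, for illustration only;
# Ansible's ssh connection plugin manages these paths itself.
CONTROL_PATH = "/root/.ansible/cp/example-socket"
HOST = "root@203.0.113.10"

def run_over_master(command: str) -> subprocess.CompletedProcess:
    # -o ControlPath points ssh at the existing multiplexed master,
    # so this invocation reuses the established connection.
    return subprocess.run(
        ["ssh", "-o", f"ControlPath={CONTROL_PATH}",
         "-o", "BatchMode=yes", HOST, command],
        capture_output=True, text=True, check=False)

result = run_over_master("echo ~ && sleep 0")
print(result.returncode, result.stdout.strip())
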
<<< 41016 1727204178.10607: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpsc_4v8ay /root/.ansible/tmp/ansible-tmp-1727204178.0087364-41254-37210591853785/AnsiballZ_setup.py <<< 41016 1727204178.10665: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204178.0087364-41254-37210591853785/AnsiballZ_setup.py" <<< 41016 1727204178.10680: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpsc_4v8ay" to remote "/root/.ansible/tmp/ansible-tmp-1727204178.0087364-41254-37210591853785/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204178.0087364-41254-37210591853785/AnsiballZ_setup.py" <<< 41016 1727204178.13439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204178.13544: stderr chunk (state=3): >>><<< 41016 1727204178.13548: stdout chunk (state=3): >>><<< 41016 1727204178.13585: done transferring module to remote 41016 1727204178.13588: _low_level_execute_command(): starting 41016 1727204178.13596: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204178.0087364-41254-37210591853785/ /root/.ansible/tmp/ansible-tmp-1727204178.0087364-41254-37210591853785/AnsiballZ_setup.py && sleep 0' 41016 1727204178.15004: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204178.15096: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204178.15189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204178.17159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204178.17194: stderr chunk (state=3): >>><<< 41016 1727204178.17296: stdout chunk (state=3): >>><<< 41016 1727204178.17406: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204178.17414: _low_level_execute_command(): starting 41016 1727204178.17417: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204178.0087364-41254-37210591853785/AnsiballZ_setup.py && sleep 0' 41016 1727204178.18473: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204178.18683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41016 1727204178.18698: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204178.18802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204178.18851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204178.19030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204178.21429: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 41016 1727204178.21443: stdout chunk (state=3): >>>import _imp # builtin <<< 41016 1727204178.21571: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 41016 1727204178.21594: stdout chunk (state=3): >>>import '_weakref' # import '_io' # import 'marshal' # <<< 41016 1727204178.21619: stdout chunk (state=3): >>>import 'posix' # <<< 41016 1727204178.21647: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 41016 1727204178.21668: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 41016 1727204178.21722: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204178.21737: stdout chunk (state=3): >>>import '_codecs' # <<< 41016 1727204178.21759: stdout chunk (state=3): >>>import 'codecs' # <<< 41016 1727204178.21793: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc 
matches /usr/lib64/python3.12/encodings/aliases.py <<< 41016 1727204178.21873: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee72184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee71e7b30> <<< 41016 1727204178.21879: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 41016 1727204178.21882: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee721aa50> <<< 41016 1727204178.21904: stdout chunk (state=3): >>>import '_signal' # <<< 41016 1727204178.21946: stdout chunk (state=3): >>>import '_abc' # <<< 41016 1727204178.21997: stdout chunk (state=3): >>>import 'abc' # <<< 41016 1727204178.22002: stdout chunk (state=3): >>>import 'io' # <<< 41016 1727204178.22031: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 41016 1727204178.22107: stdout chunk (state=3): >>>import '_collections_abc' # <<< 41016 1727204178.22193: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 41016 1727204178.22248: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' <<< 41016 1727204178.22284: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6fc9130> <<< 41016 1727204178.22346: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204178.22480: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6fca060> <<< 41016 1727204178.22502: stdout chunk (state=3): >>>import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
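Editorial aside: the flood of "import ..." lines that follows is simply the remote interpreter tracing its own import machinery, because the module was launched with PYTHONVERBOSE=1 in the _low_level_execute_command() shown above. A minimal local reproduction of that effect, assuming nothing about Ansible itself (the child script here is arbitrary):

import os
import subprocess
import sys

# Run a child interpreter with the same PYTHONVERBOSE=1 knob used in the
# command above and capture whatever it prints; verbose import lines
# normally land on stderr, so scan both streams to be safe.
env = dict(os.environ, PYTHONVERBOSE="1")
proc = subprocess.run(
    [sys.executable, "-c", "import json, pathlib"],
    env=env, capture_output=True, text=True)

# Each "import '...' #" / "# code object from ..." line mirrors the
# entries interleaved through the chunks in this log.
for line in (proc.stderr + proc.stdout).splitlines():
    if line.startswith(("import ", "# ")):
        print(line)
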
<<< 41016 1727204178.22851: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 41016 1727204178.22883: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204178.22898: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 41016 1727204178.22955: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 41016 1727204178.22959: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 41016 1727204178.23044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 41016 1727204178.23061: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7007f50> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 41016 1727204178.23064: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 41016 1727204178.23110: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee701c0e0> <<< 41016 1727204178.23130: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 41016 1727204178.23154: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 41016 1727204178.23270: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204178.23293: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee703f980> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 41016 1727204178.23353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee703ff50> import '_collections' # <<< 41016 1727204178.23371: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee701fc20> import '_functools' # <<< 41016 1727204178.23406: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee701d340> <<< 41016 1727204178.23508: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7005100> <<< 41016 1727204178.23541: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 41016 1727204178.23632: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 41016 1727204178.23635: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 41016 1727204178.24099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7063950> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7062570> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee701e210> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7060d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7090950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7004380> <<< 41016 1727204178.24124: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee7090e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7090cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee70910a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7002ea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7091760> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7091460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7092660> import 'importlib.util' # import 'runpy' # <<< 41016 
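Editorial aside: each "stdout chunk (state=N)" / "stderr chunk (state=N)" record is the controller draining the ssh child's pipes as data trickles in, rather than waiting for the command to exit. The sketch below is only a loose stand-in for that pattern (plain pipes and select(), not Ansible's connection code):

import select
import subprocess
import sys

# Spawn a child that writes to both streams, then report each piece of
# output as a "chunk" as soon as it is readable.
proc = subprocess.Popen(
    [sys.executable, "-c",
     "import sys; print('hello from stdout'); "
     "print('hello from stderr', file=sys.stderr)"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)

streams = {proc.stdout: "stdout", proc.stderr: "stderr"}
while streams:
    readable, _, _ = select.select(list(streams), [], [])
    for pipe in readable:
        chunk = pipe.read1(4096)
        if chunk:
            print(f"{streams[pipe]} chunk: {chunk!r}")
        else:  # EOF on this pipe
            pipe.close()
            del streams[pipe]
proc.wait()
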
1727204178.24154: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 41016 1727204178.24192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 41016 1727204178.24230: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee70ac860> <<< 41016 1727204178.24254: stdout chunk (state=3): >>>import 'errno' # <<< 41016 1727204178.24285: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204178.24354: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee70adfa0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 41016 1727204178.24493: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee70aee40> <<< 41016 1727204178.24508: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee70af4a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee70ae390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee70aff20> <<< 41016 1727204178.24532: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee70af650> <<< 41016 1727204178.24570: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7092690> <<< 41016 1727204178.24597: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 41016 1727204178.24699: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 41016 1727204178.24718: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6dafda0> <<< 41016 1727204178.24782: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 41016 1727204178.24862: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6dd8800> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6dd8560> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6dd8830> <<< 41016 1727204178.24876: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 41016 1727204178.25040: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204178.25043: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6dd9160> <<< 41016 1727204178.25222: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204178.25225: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6dd9ac0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6dd8a10> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6dadf40> <<< 41016 1727204178.25248: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 41016 1727204178.25268: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 41016 1727204178.25307: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 41016 1727204178.25340: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6ddae70> <<< 41016 1727204178.25345: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6dd9940> <<< 41016 1727204178.25467: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7092d80> <<< 41016 1727204178.25471: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 41016 1727204178.25490: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204178.25494: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 41016 1727204178.25527: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 41016 1727204178.25550: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6e071d0> <<< 41016 1727204178.25633: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 41016 1727204178.25661: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 41016 1727204178.25708: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 41016 1727204178.25718: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6e2b560> <<< 41016 1727204178.25734: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 41016 1727204178.25787: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 41016 1727204178.25840: stdout chunk (state=3): >>>import 'ntpath' # <<< 41016 1727204178.25884: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6e88350> <<< 41016 1727204178.25897: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 41016 1727204178.25924: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 41016 1727204178.25950: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 41016 1727204178.25995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 41016 1727204178.26085: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6e8aab0> <<< 41016 1727204178.26162: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6e88470> <<< 41016 1727204178.26218: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6e51340> <<< 41016 1727204178.26263: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6729490> 
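Editorial aside: a little further on the trace reports "zipimport: found 103 names in '/tmp/ansible_setup_payload_9id4ec3g/ansible_setup_payload.zip'": the AnsiballZ wrapper ships the module and its module_utils as a zip archive and imports them in place instead of copying a package tree. The toy below only demonstrates that standard zipimport mechanism (the file and module names are invented; it is not Ansible's wrapper):

import importlib
import os
import sys
import tempfile
import zipfile

# Build a throwaway zip containing a single module (invented name).
tmpdir = tempfile.mkdtemp()
payload = os.path.join(tmpdir, "toy_payload.zip")
with zipfile.ZipFile(payload, "w") as zf:
    zf.writestr("toy_module.py", "GREETING = 'imported from a zip'\n")

# Putting the archive on sys.path lets zipimport resolve modules inside it,
# the same mechanism behind the "zipimport: found N names" lines.
sys.path.insert(0, payload)
toy_module = importlib.import_module("toy_module")
print(toy_module.GREETING)
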
<<< 41016 1727204178.26266: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6e2a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6ddbda0> <<< 41016 1727204178.26458: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 41016 1727204178.26678: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5ee6729730> <<< 41016 1727204178.26994: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_9id4ec3g/ansible_setup_payload.zip' # zipimport: zlib available <<< 41016 1727204178.27199: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.27231: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 41016 1727204178.27252: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 41016 1727204178.27474: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 41016 1727204178.27516: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee678f230> import '_typing' # <<< 41016 1727204178.27905: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6772120> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6771280> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 41016 1727204178.27931: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.30032: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.31271: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 41016 1727204178.31467: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee678d100> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 41016 1727204178.31515: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee67c2b40> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee67c28d0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee67c21e0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 41016 1727204178.31552: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee67c2cf0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee678fec0> <<< 41016 1727204178.31593: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee67c3890> <<< 41016 1727204178.31716: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee67c3ad0> <<< 41016 1727204178.31719: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 41016 1727204178.31731: stdout chunk (state=3): >>>import '_locale' # <<< 41016 1727204178.31768: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee67c3f50> import 'pwd' # <<< 41016 1727204178.31805: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 41016 1727204178.31910: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 41016 1727204178.31914: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee662ddf0> <<< 41016 1727204178.31916: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee662fa10> <<< 41016 1727204178.32058: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 41016 1727204178.32062: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6630410> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 41016 1727204178.32064: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6631310> <<< 41016 1727204178.32118: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 41016 1727204178.32284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 41016 1727204178.32288: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6633fe0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6ddade0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee66322a0> <<< 41016 1727204178.32290: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 41016 1727204178.32328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 41016 1727204178.32685: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 41016 1727204178.32689: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee663bef0> <<< 41016 1727204178.32715: stdout chunk (state=3): >>>import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee663a9f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee663a750> <<< 41016 1727204178.32738: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 41016 1727204178.32854: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee663ac90> <<< 41016 1727204178.32899: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee66327b0> <<< 41016 1727204178.32930: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee667ff80> <<< 41016 1727204178.32978: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 
'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6680200> <<< 41016 1727204178.33233: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 41016 1727204178.33236: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6681d60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6681b20> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204178.33252: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee66842c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6682450> <<< 41016 1727204178.33268: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 41016 1727204178.33329: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204178.33483: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6687aa0> <<< 41016 1727204178.33705: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6684470> <<< 41016 1727204178.33731: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6688920> <<< 41016 1727204178.33874: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6688950> <<< 41016 1727204178.33881: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 
'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6688dd0> <<< 41016 1727204178.33884: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee66804d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 41016 1727204178.33993: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204178.34133: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee65143e0> <<< 41016 1727204178.34253: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 41016 1727204178.34318: stdout chunk (state=3): >>> # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee65159a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee668ab70><<< 41016 1727204178.34390: stdout chunk (state=3): >>> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204178.34403: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee668bf20><<< 41016 1727204178.34471: stdout chunk (state=3): >>> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee668a780> # zipimport: zlib available <<< 41016 1727204178.34563: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 41016 1727204178.34586: stdout chunk (state=3): >>> # zipimport: zlib available <<< 41016 1727204178.34801: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.34838: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.34888: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 41016 1727204178.34916: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.35004: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 41016 1727204178.35199: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.35299: stdout chunk (state=3): >>> <<< 41016 1727204178.35400: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.35538: stdout chunk (state=3): >>> <<< 41016 1727204178.36298: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.36742: stdout chunk (state=3): 
>>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 41016 1727204178.36760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204178.36857: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6519910> <<< 41016 1727204178.36909: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 41016 1727204178.36983: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee651a6f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6515c70> <<< 41016 1727204178.36991: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 41016 1727204178.37005: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.37027: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.37292: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 41016 1727204178.37302: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.37433: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 41016 1727204178.37437: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee651a7e0> <<< 41016 1727204178.37439: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.38209: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.38760: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.38778: stdout chunk (state=3): >>> <<< 41016 1727204178.38901: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.38923: stdout chunk (state=3): >>> <<< 41016 1727204178.39059: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 41016 1727204178.39158: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # <<< 41016 1727204178.39182: stdout chunk (state=3): >>> <<< 41016 1727204178.39198: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.39380: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.39455: stdout chunk (state=3): >>> import 'ansible.module_utils.errors' # <<< 41016 1727204178.39484: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.39500: stdout chunk (state=3): >>> <<< 41016 1727204178.39572: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.39591: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 41016 
1727204178.39650: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.39678: stdout chunk (state=3): >>> <<< 41016 1727204178.39795: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available<<< 41016 1727204178.40183: stdout chunk (state=3): >>> <<< 41016 1727204178.40198: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.40621: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 41016 1727204178.40740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 41016 1727204178.40780: stdout chunk (state=3): >>>import '_ast' # <<< 41016 1727204178.40906: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee651b890> <<< 41016 1727204178.40940: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.40955: stdout chunk (state=3): >>> <<< 41016 1727204178.41104: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.41209: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 41016 1727204178.41408: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available <<< 41016 1727204178.41469: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 41016 1727204178.41492: stdout chunk (state=3): >>> # zipimport: zlib available<<< 41016 1727204178.41518: stdout chunk (state=3): >>> <<< 41016 1727204178.41632: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 41016 1727204178.41654: stdout chunk (state=3): >>> <<< 41016 1727204178.41741: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.41753: stdout chunk (state=3): >>> <<< 41016 1727204178.41863: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 41016 1727204178.41945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'<<< 41016 1727204178.41968: stdout chunk (state=3): >>> <<< 41016 1727204178.42152: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee65261b0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6521010><<< 41016 1727204178.42184: stdout chunk (state=3): >>> <<< 41016 1727204178.42291: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 41016 1727204178.42358: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 41016 1727204178.42368: stdout chunk (state=3): >>> <<< 41016 1727204178.42465: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.42508: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.42529: stdout chunk (state=3): >>> <<< 
41016 1727204178.42581: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py<<< 41016 1727204178.42732: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 41016 1727204178.42827: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc'<<< 41016 1727204178.42944: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 41016 1727204178.42980: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee660ea80><<< 41016 1727204178.43000: stdout chunk (state=3): >>> <<< 41016 1727204178.43063: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee67ee750><<< 41016 1727204178.43085: stdout chunk (state=3): >>> <<< 41016 1727204178.43204: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6526030> <<< 41016 1727204178.43274: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6515e50> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available<<< 41016 1727204178.43296: stdout chunk (state=3): >>> <<< 41016 1727204178.43378: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # <<< 41016 1727204178.43496: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # <<< 41016 1727204178.43525: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.43558: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.43587: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 41016 1727204178.43817: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.43820: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 41016 1727204178.43846: stdout chunk (state=3): >>> <<< 41016 1727204178.43869: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.43918: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.43977: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.43998: stdout chunk (state=3): >>> <<< 41016 1727204178.44097: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 41016 1727204178.44123: stdout chunk (state=3): >>> <<< 41016 1727204178.44156: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 41016 1727204178.44181: stdout chunk (state=3): >>> <<< 41016 1727204178.44198: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.44298: stdout chunk (state=3): >>> <<< 41016 1727204178.44346: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.44463: 
stdout chunk (state=3): >>> <<< 41016 1727204178.44487: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.44510: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.44528: stdout chunk (state=3): >>> <<< 41016 1727204178.44576: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 41016 1727204178.44603: stdout chunk (state=3): >>> <<< 41016 1727204178.44686: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.44911: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.45001: stdout chunk (state=3): >>> <<< 41016 1727204178.45217: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.45301: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.45380: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py<<< 41016 1727204178.45401: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc'<<< 41016 1727204178.45505: stdout chunk (state=3): >>> # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 41016 1727204178.45550: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc'<<< 41016 1727204178.45577: stdout chunk (state=3): >>> <<< 41016 1727204178.45600: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee65b66c0> <<< 41016 1727204178.45654: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc'<<< 41016 1727204178.45809: stdout chunk (state=3): >>> <<< 41016 1727204178.45816: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 41016 1727204178.45840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 41016 1727204178.45862: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee61dff80><<< 41016 1727204178.45882: stdout chunk (state=3): >>> <<< 41016 1727204178.45935: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so'<<< 41016 1727204178.46005: stdout chunk (state=3): >>> <<< 41016 1727204178.46029: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee61e43e0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee65a3200><<< 41016 
1727204178.46051: stdout chunk (state=3): >>> <<< 41016 1727204178.46212: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee65b71d0> <<< 41016 1727204178.46215: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee65b4da0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee65b49e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py<<< 41016 1727204178.46279: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc'<<< 41016 1727204178.46304: stdout chunk (state=3): >>> <<< 41016 1727204178.46329: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 41016 1727204178.46348: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc'<<< 41016 1727204178.46372: stdout chunk (state=3): >>> <<< 41016 1727204178.46396: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 41016 1727204178.46418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc'<<< 41016 1727204178.46439: stdout chunk (state=3): >>> <<< 41016 1727204178.46471: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee61e73b0><<< 41016 1727204178.46541: stdout chunk (state=3): >>> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee61e6c60> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204178.46567: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee61e6e40> <<< 41016 1727204178.46583: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee61e6090> <<< 41016 1727204178.46599: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 41016 1727204178.47044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee61e7500> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f5ee6246000> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee61e7f50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee65b4a70> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 41016 1727204178.47131: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.47228: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 41016 1727204178.47297: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.47360: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 41016 1727204178.47392: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41016 1727204178.47669: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # <<< 41016 1727204178.47672: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # <<< 41016 1727204178.47675: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.47679: stdout chunk (state=3): >>> <<< 41016 1727204178.47736: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.47802: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 41016 1727204178.47833: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.47936: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.48072: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.48147: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.48261: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 41016 1727204178.48287: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 41016 1727204178.48318: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.49149: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.49955: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available<<< 41016 1727204178.50003: stdout chunk (state=3): >>> <<< 41016 1727204178.50043: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.50160: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.50189: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.50213: stdout chunk (state=3): >>> <<< 41016 1727204178.50295: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 41016 1727204178.50335: stdout chunk (state=3): >>> # zipimport: zlib available <<< 41016 1727204178.50354: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.50434: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 41016 1727204178.50437: stdout chunk (state=3): >>> # zipimport: zlib available <<< 41016 1727204178.50529: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.50622: stdout chunk (state=3): >>> import 
'ansible.module_utils.facts.system.dns' # <<< 41016 1727204178.50655: stdout chunk (state=3): >>> # zipimport: zlib available<<< 41016 1727204178.50669: stdout chunk (state=3): >>> <<< 41016 1727204178.50705: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.50758: stdout chunk (state=3): >>> <<< 41016 1727204178.50771: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 41016 1727204178.50788: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.50847: stdout chunk (state=3): >>> # zipimport: zlib available <<< 41016 1727204178.50899: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 41016 1727204178.50933: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.51011: stdout chunk (state=3): >>> <<< 41016 1727204178.51061: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.51214: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 41016 1727204178.51227: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 41016 1727204178.51303: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6246060> <<< 41016 1727204178.51319: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py<<< 41016 1727204178.51362: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 41016 1727204178.51562: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6246d50> <<< 41016 1727204178.51642: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available<<< 41016 1727204178.51656: stdout chunk (state=3): >>> <<< 41016 1727204178.51732: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.51840: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.lsb' # <<< 41016 1727204178.51866: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.52056: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.52282: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 41016 1727204178.52309: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.52397: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.52424: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.platform' # <<< 41016 1727204178.52459: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.52538: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.52620: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py<<< 41016 1727204178.52633: stdout chunk (state=3): >>> <<< 41016 1727204178.52735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 41016 1727204178.52820: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so'<<< 41016 1727204178.52824: stdout chunk (state=3): >>> <<< 41016 1727204178.52928: stdout chunk (state=3): >>># extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so'<<< 41016 1727204178.53098: stdout chunk (state=3): >>> import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee62823f0> <<< 41016 1727204178.53277: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee62731d0><<< 41016 1727204178.53288: stdout chunk (state=3): >>> <<< 41016 1727204178.53299: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 41016 1727204178.53305: stdout chunk (state=3): >>> <<< 41016 1727204178.53336: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.53341: stdout chunk (state=3): >>> <<< 41016 1727204178.53445: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.53451: stdout chunk (state=3): >>> <<< 41016 1727204178.53538: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 41016 1727204178.53543: stdout chunk (state=3): >>> <<< 41016 1727204178.53568: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.53573: stdout chunk (state=3): >>> <<< 41016 1727204178.53719: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.53722: stdout chunk (state=3): >>> <<< 41016 1727204178.53856: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.53861: stdout chunk (state=3): >>> <<< 41016 1727204178.54055: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.54299: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 41016 1727204178.54328: stdout chunk (state=3): >>> # zipimport: zlib available<<< 41016 1727204178.54453: stdout chunk (state=3): >>> # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 41016 1727204178.54655: stdout chunk (state=3): >>> <<< 41016 1727204178.54658: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 41016 1727204178.54771: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6295eb0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee62733b0><<< 41016 1727204178.54802: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 41016 1727204178.54827: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 41016 1727204178.54846: stdout chunk (state=3): >>> # zipimport: zlib available<<< 41016 1727204178.54915: stdout chunk (state=3): >>> # zipimport: zlib available <<< 41016 1727204178.54967: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 41016 1727204178.55030: stdout chunk (state=3): >>> # zipimport: zlib available <<< 41016 1727204178.55280: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.55398: stdout chunk (state=3): >>> <<< 41016 1727204178.55566: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.aix' # <<< 41016 1727204178.55595: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.55756: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.55770: stdout chunk (state=3): >>> <<< 41016 1727204178.55925: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.55942: stdout chunk (state=3): >>> <<< 41016 1727204178.56015: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.56020: stdout chunk (state=3): >>> <<< 41016 1727204178.56107: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 41016 1727204178.56291: stdout chunk (state=3): >>> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 41016 1727204178.56411: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.56503: stdout chunk (state=3): >>> <<< 41016 1727204178.56674: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 41016 1727204178.56703: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 41016 1727204178.56729: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.56935: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.56951: stdout chunk (state=3): >>> <<< 41016 1727204178.57307: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 41016 1727204178.58228: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.58231: stdout chunk (state=3): >>> <<< 41016 1727204178.59125: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 41016 1727204178.59171: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 41016 1727204178.59523: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # <<< 41016 1727204178.59525: stdout chunk (state=3): >>> <<< 41016 1727204178.59549: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.59719: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.59724: stdout chunk (state=3): >>> <<< 41016 1727204178.59878: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 41016 1727204178.59885: stdout chunk (state=3): >>> <<< 41016 1727204178.59913: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.59916: stdout chunk (state=3): >>> <<< 41016 1727204178.60176: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.60182: stdout chunk (state=3): >>> <<< 41016 1727204178.60454: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 41016 1727204178.60488: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.60519: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.60528: stdout chunk (state=3): >>> <<< 41016 1727204178.60564: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 41016 1727204178.60699: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.base' # <<< 41016 1727204178.60725: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.60880: stdout 
chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.60900: stdout chunk (state=3): >>> <<< 41016 1727204178.61044: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.61098: stdout chunk (state=3): >>> <<< 41016 1727204178.61408: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.61431: stdout chunk (state=3): >>> <<< 41016 1727204178.61745: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 41016 1727204178.61769: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.network.aix' # <<< 41016 1727204178.61801: stdout chunk (state=3): >>> # zipimport: zlib available<<< 41016 1727204178.61806: stdout chunk (state=3): >>> <<< 41016 1727204178.61894: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.61923: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 41016 1727204178.61950: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.61996: stdout chunk (state=3): >>> # zipimport: zlib available<<< 41016 1727204178.61999: stdout chunk (state=3): >>> <<< 41016 1727204178.62041: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 41016 1727204178.62044: stdout chunk (state=3): >>> <<< 41016 1727204178.62175: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 41016 1727204178.62182: stdout chunk (state=3): >>> <<< 41016 1727204178.62290: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 41016 1727204178.62296: stdout chunk (state=3): >>> <<< 41016 1727204178.62322: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.62331: stdout chunk (state=3): >>> <<< 41016 1727204178.62369: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.62373: stdout chunk (state=3): >>> <<< 41016 1727204178.62412: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 41016 1727204178.62416: stdout chunk (state=3): >>> <<< 41016 1727204178.62445: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.62448: stdout chunk (state=3): >>> <<< 41016 1727204178.62538: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.62543: stdout chunk (state=3): >>> <<< 41016 1727204178.62638: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available<<< 41016 1727204178.62730: stdout chunk (state=3): >>> # zipimport: zlib available <<< 41016 1727204178.62822: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 41016 1727204178.62849: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.63336: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.63809: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 41016 1727204178.63992: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # <<< 41016 1727204178.64025: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.64090: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.64150: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 41016 1727204178.64182: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.64236: stdout chunk (state=3): >>># zipimport: zlib 
available<<< 41016 1727204178.64286: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.network.netbsd' # <<< 41016 1727204178.64292: stdout chunk (state=3): >>> <<< 41016 1727204178.64319: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.64327: stdout chunk (state=3): >>> <<< 41016 1727204178.64416: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # <<< 41016 1727204178.64425: stdout chunk (state=3): >>> <<< 41016 1727204178.64444: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.64454: stdout chunk (state=3): >>> <<< 41016 1727204178.64572: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.64699: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.network.sunos' # <<< 41016 1727204178.64734: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.64737: stdout chunk (state=3): >>> <<< 41016 1727204178.64765: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.64782: stdout chunk (state=3): >>> <<< 41016 1727204178.64790: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 41016 1727204178.64823: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.64899: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.64904: stdout chunk (state=3): >>> <<< 41016 1727204178.64966: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 41016 1727204178.65004: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.65069: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41016 1727204178.65149: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.65230: stdout chunk (state=3): >>> <<< 41016 1727204178.65285: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.65411: stdout chunk (state=3): >>> # zipimport: zlib available<<< 41016 1727204178.65420: stdout chunk (state=3): >>> <<< 41016 1727204178.65533: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 41016 1727204178.65540: stdout chunk (state=3): >>> <<< 41016 1727204178.65560: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # <<< 41016 1727204178.65567: stdout chunk (state=3): >>> <<< 41016 1727204178.65585: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 41016 1727204178.65611: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.65619: stdout chunk (state=3): >>> <<< 41016 1727204178.65696: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204178.65702: stdout chunk (state=3): >>> <<< 41016 1727204178.65995: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 41016 1727204178.66136: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.66457: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 41016 1727204178.66482: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.66556: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.66626: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 41016 1727204178.66653: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.66726: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 41016 1727204178.66793: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 41016 1727204178.66815: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.66939: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.67058: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 41016 1727204178.67070: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 41016 1727204178.67095: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.67227: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.67361: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 41016 1727204178.67383: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # <<< 41016 1727204178.67387: stdout chunk (state=3): >>>import 'ansible.module_utils.facts' # <<< 41016 1727204178.67513: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204178.67811: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 41016 1727204178.68002: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6093110> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee60914f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6091070> <<< 41016 1727204178.69292: stdout chunk (state=3): >>> <<< 41016 1727204178.69325: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", 
"ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEa<<< 41016 1727204178.69355: stdout chunk (state=3): >>>XgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribut<<< 41016 1727204178.69363: stdout chunk (state=3): >>>ion_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "18", "epoch": "1727204178", "epoch_int": "1727204178", "date": "2024-09-24", "time": "14:56:18", "iso8601_micro": "2024-09-24T18:56:18.690138Z", "iso8601": "2024-09-24T18:56:18Z", "iso8601_basic": "20240924T145618690138", "iso8601_basic_short": "20240924T145618", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": 
["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}}<<< 41016 1727204178.69401: stdout chunk (state=3): >>> <<< 41016 1727204178.70482: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 41016 1727204178.70507: stdout chunk (state=3): >>> # clear sys.path_hooks<<< 41016 1727204178.70513: stdout chunk (state=3): >>> # clear builtins._<<< 41016 1727204178.70535: stdout chunk (state=3): >>> # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value<<< 41016 1727204178.70548: stdout chunk (state=3): >>> # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib<<< 41016 1727204178.70569: stdout chunk (state=3): >>> # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix<<< 41016 1727204178.70606: stdout chunk (state=3): >>> # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 <<< 41016 1727204178.70612: stdout chunk (state=3): >>># cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os<<< 41016 1727204178.70639: stdout chunk (state=3): >>> # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib<<< 41016 1727204178.70667: stdout chunk (state=3): >>> # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg <<< 41016 1727204178.70693: stdout chunk (state=3): >>># cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno<<< 41016 1727204178.70727: stdout chunk (state=3): >>> # cleanup[2] removing 
zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib<<< 41016 1727204178.70738: stdout chunk (state=3): >>> # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading<<< 41016 1727204178.70758: stdout chunk (state=3): >>> # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib<<< 41016 1727204178.70783: stdout chunk (state=3): >>> # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil <<< 41016 1727204178.70825: stdout chunk (state=3): >>># destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select<<< 41016 1727204178.70836: stdout chunk (state=3): >>> # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token<<< 41016 1727204178.70855: stdout chunk (state=3): >>> # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime<<< 41016 1727204178.70876: stdout chunk (state=3): >>> # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128<<< 41016 1727204178.70909: stdout chunk (state=3): >>> # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text<<< 41016 1727204178.70926: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing 
ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters<<< 41016 1727204178.70938: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy<<< 41016 1727204178.70957: stdout chunk (state=3): >>> # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing<<< 41016 1727204178.70983: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters<<< 41016 1727204178.70999: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4<<< 41016 1727204178.71023: stdout chunk (state=3): >>> # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file<<< 41016 1727204178.71038: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils<<< 41016 1727204178.71066: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction<<< 41016 1727204178.71079: stdout chunk (state=3): >>> # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq<<< 41016 1727204178.71102: stdout chunk 
(state=3): >>> # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector <<< 41016 1727204178.71122: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils<<< 41016 1727204178.71147: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser<<< 41016 1727204178.71157: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl<<< 41016 1727204178.71179: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass<<< 41016 1727204178.71197: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux <<< 41016 1727204178.71230: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn<<< 41016 1727204178.71248: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme<<< 41016 1727204178.71267: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux<<< 41016 1727204178.71278: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts<<< 41016 1727204178.71301: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution<<< 41016 1727204178.71323: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr<<< 41016 1727204178.71341: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user<<< 41016 1727204178.71347: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy 
ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly <<< 41016 1727204178.71370: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base<<< 41016 1727204178.71390: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi<<< 41016 1727204178.71416: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd<<< 41016 1727204178.71493: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 41016 1727204178.72681: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # 
destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 41016 1727204178.72846: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 41016 1727204178.72930: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 41016 1727204178.72950: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux <<< 41016 1727204178.72967: stdout chunk (state=3): >>># destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 41016 1727204178.73181: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 41016 1727204178.73193: stdout 
chunk (state=3): >>># destroy _collections <<< 41016 1727204178.73239: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 41016 1727204178.73268: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 41016 1727204178.73272: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 41016 1727204178.73486: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator <<< 41016 1727204178.73491: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 41016 1727204178.73494: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 41016 1727204178.73504: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 41016 1727204178.73528: stdout chunk (state=3): >>># destroy _hashlib <<< 41016 1727204178.73556: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re <<< 41016 1727204178.73585: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 41016 1727204178.73596: stdout chunk (state=3): >>># clear sys.audit hooks <<< 41016 1727204178.74044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
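The fact-gathering result streamed above records the exact module_args the setup module ran with (gather_subset: ["min"], gather_timeout: 10, filter: [], fact_path: "/etc/ansible/facts.d"). A minimal playbook sketch that would reproduce that invocation follows; the file name, hosts pattern, and play layout are assumptions made here for illustration, while gather_subset, gather_timeout, and fact_path are standard ansible.builtin.setup options.

    # gather_min.yml -- hypothetical file name; mirrors the recorded module_args
    - hosts: all
      gather_facts: false              # gather explicitly via the setup task below
      tasks:
        - name: Gather only the "min" fact subset
          ansible.builtin.setup:
            gather_subset:
              - min                    # matches gather_subset: ["min"] in the result
            gather_timeout: 10         # matches gather_timeout: 10
            fact_path: /etc/ansible/facts.d   # default local-facts directory, as recorded

Run with ansible-playbook against the same inventory to produce an equivalent minimal-facts result (ansible_pkg_mgr, default addresses, and the other "min" subset keys seen in the JSON above).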
<<< 41016 1727204178.74048: stdout chunk (state=3): >>><<< 41016 1727204178.74056: stderr chunk (state=3): >>><<< 41016 1727204178.74397: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee72184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee71e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee721aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6fc9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6fca060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7007f50> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee701c0e0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee703f980> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee703ff50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee701fc20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee701d340> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7005100> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7063950> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7062570> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee701e210> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7060d70> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7090950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7004380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee7090e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7090cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee70910a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7002ea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7091760> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7091460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7092660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee70ac860> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee70adfa0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f5ee70aee40> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee70af4a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee70ae390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee70aff20> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee70af650> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7092690> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6dafda0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6dd8800> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6dd8560> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6dd8830> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6dd9160> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6dd9ac0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6dd8a10> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6dadf40> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6ddae70> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6dd9940> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee7092d80> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6e071d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6e2b560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6e88350> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6e8aab0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6e88470> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6e51340> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6729490> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6e2a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6ddbda0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f5ee6729730> # zipimport: found 103 names in '/tmp/ansible_setup_payload_9id4ec3g/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee678f230> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6772120> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6771280> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee678d100> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee67c2b40> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee67c28d0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee67c21e0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee67c2cf0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee678fec0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee67c3890> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee67c3ad0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee67c3f50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee662ddf0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee662fa10> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6630410> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6631310> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6633fe0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6ddade0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee66322a0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee663bef0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee663a9f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee663a750> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee663ac90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee66327b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee667ff80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6680200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6681d60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6681b20> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee66842c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6682450> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6687aa0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6684470> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6688920> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6688950> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6688dd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee66804d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee65143e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee65159a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee668ab70> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee668bf20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee668a780> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6519910> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee651a6f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6515c70> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee651a7e0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee651b890> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee65261b0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6521010> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee660ea80> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee67ee750> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6526030> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6515e50> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee65b66c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee61dff80> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee61e43e0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee65a3200> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee65b71d0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee65b4da0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee65b49e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee61e73b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee61e6c60> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee61e6e40> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee61e6090> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee61e7500> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6246000> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee61e7f50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee65b4a70> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6246060> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6246d50> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee62823f0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee62731d0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6295eb0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee62733b0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5ee6093110> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee60914f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5ee6091070> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "18", "epoch": "1727204178", "epoch_int": "1727204178", "date": "2024-09-24", "time": "14:56:18", "iso8601_micro": "2024-09-24T18:56:18.690138Z", "iso8601": "2024-09-24T18:56:18Z", "iso8601_basic": "20240924T145618690138", "iso8601_basic_short": "20240924T145618", "tz": "EDT", "tz_dst": "EDT", "tz_offset": 
"-0400"}, "ansible_service_mgr": "systemd", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # 
cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # 
destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing 
ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # 
destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy 
pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # 
cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] 
removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] 
removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # 
destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 41016 1727204178.76107: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204178.0087364-41254-37210591853785/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204178.76114: _low_level_execute_command(): starting 41016 1727204178.76116: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204178.0087364-41254-37210591853785/ > /dev/null 2>&1 && sleep 0' 41016 1727204178.76143: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204178.76147: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204178.76150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204178.76236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204178.76240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204178.76242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204178.76324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 3 <<< 41016 1727204178.79482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204178.79486: stdout chunk (state=3): >>><<< 41016 1727204178.79488: stderr chunk (state=3): >>><<< 41016 1727204178.79490: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 3 debug2: Received exit status from master 0 41016 1727204178.79492: handler run complete 41016 1727204178.79611: variable 'ansible_facts' from source: unknown 41016 1727204178.79664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204178.79937: variable 'ansible_facts' from source: unknown 41016 1727204178.80007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204178.80149: attempt loop complete, returning result 41016 1727204178.80153: _execute() done 41016 1727204178.80155: dumping result to json 41016 1727204178.80166: done dumping result, returning 41016 1727204178.80174: done running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [028d2410-947f-12d5-0ec4-0000000000d0] 41016 1727204178.80179: sending task result for task 028d2410-947f-12d5-0ec4-0000000000d0 41016 1727204178.80650: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000d0 41016 1727204178.80654: WORKER PROCESS EXITING ok: [managed-node1] 41016 1727204178.80773: no more pending results, returning what we have 41016 1727204178.80778: results queue empty 41016 1727204178.80779: checking for any_errors_fatal 41016 1727204178.80781: done checking for any_errors_fatal 41016 1727204178.80782: checking for max_fail_percentage 41016 1727204178.80784: 
done checking for max_fail_percentage 41016 1727204178.80785: checking to see if all hosts have failed and the running result is not ok 41016 1727204178.80786: done checking to see if all hosts have failed 41016 1727204178.80786: getting the remaining hosts for this loop 41016 1727204178.80788: done getting the remaining hosts for this loop 41016 1727204178.80792: getting the next task for host managed-node1 41016 1727204178.80802: done getting next task for host managed-node1 41016 1727204178.80804: ^ task is: TASK: Check if system is ostree 41016 1727204178.80807: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204178.80813: getting variables 41016 1727204178.80814: in VariableManager get_vars() 41016 1727204178.80844: Calling all_inventory to load vars for managed-node1 41016 1727204178.80847: Calling groups_inventory to load vars for managed-node1 41016 1727204178.80968: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204178.80980: Calling all_plugins_play to load vars for managed-node1 41016 1727204178.80983: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204178.80987: Calling groups_plugins_play to load vars for managed-node1 41016 1727204178.81247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204178.81601: done with get_vars() 41016 1727204178.81682: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:56:18 -0400 (0:00:00.911) 0:00:02.495 ***** 41016 1727204178.81905: entering _queue_task() for managed-node1/stat 41016 1727204178.82515: worker is 1 (out of 1 available) 41016 1727204178.82526: exiting _queue_task() for managed-node1/stat 41016 1727204178.82602: done queuing things up, now waiting for results queue to drain 41016 1727204178.82604: waiting for pending results... 
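
For readers correlating this trace with the test playbook: the two tasks in play here are the minimal fact gathering that just completed (the 'gather_subset': 'min' argument is visible in the _execute_module(setup, ...) call above) and the "Check if system is ostree" stat task queued from el_repo_setup.yml:17. The task file itself is not reproduced in this log, so the sketch below is a hedged reconstruction rather than the authors' source: the stat path /run/ostree-booted and the register name are assumptions based on the conventional ostree check, while the gather_subset value, the task names, and the `not __network_is_ostree is defined` conditional are taken directly from the trace.

- name: Gather the minimum subset of ansible_facts required by the network role test
  ansible.builtin.setup:
    gather_subset: min            # matches 'gather_subset': 'min' in the module args logged above

- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted      # assumed marker file; the real path is not shown in this trace
  register: __ostree_booted_stat  # hypothetical register name, not taken from the log
  when: not __network_is_ostree is defined   # conditional the trace reports as evaluating to True

Because the stat task runs through the 'normal' action plugin (loaded below), its AnsiballZ_stat.py payload is built on the controller and copied into the remote temp directory before execution, which is exactly the sequence of _low_level_execute_command() and sftp put calls that follows in this trace.
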
41016 1727204178.82972: running TaskExecutor() for managed-node1/TASK: Check if system is ostree 41016 1727204178.83249: in run() - task 028d2410-947f-12d5-0ec4-0000000000d2 41016 1727204178.83326: variable 'ansible_search_path' from source: unknown 41016 1727204178.83329: variable 'ansible_search_path' from source: unknown 41016 1727204178.83362: calling self._execute() 41016 1727204178.83499: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204178.83504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204178.83580: variable 'omit' from source: magic vars 41016 1727204178.84772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204178.85612: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204178.85774: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204178.86092: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204178.86096: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204178.86435: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204178.86468: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204178.86503: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204178.86607: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204178.86852: Evaluated conditional (not __network_is_ostree is defined): True 41016 1727204178.87070: variable 'omit' from source: magic vars 41016 1727204178.87073: variable 'omit' from source: magic vars 41016 1727204178.87076: variable 'omit' from source: magic vars 41016 1727204178.87184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204178.87220: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204178.87244: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204178.87265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204178.87302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204178.87338: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204178.87396: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204178.87411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204178.87625: Set connection var ansible_shell_executable to /bin/sh 41016 1727204178.87638: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 
1727204178.87648: Set connection var ansible_shell_type to sh 41016 1727204178.87656: Set connection var ansible_timeout to 10 41016 1727204178.87665: Set connection var ansible_pipelining to False 41016 1727204178.87726: Set connection var ansible_connection to ssh 41016 1727204178.87756: variable 'ansible_shell_executable' from source: unknown 41016 1727204178.87764: variable 'ansible_connection' from source: unknown 41016 1727204178.87771: variable 'ansible_module_compression' from source: unknown 41016 1727204178.87779: variable 'ansible_shell_type' from source: unknown 41016 1727204178.87843: variable 'ansible_shell_executable' from source: unknown 41016 1727204178.87846: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204178.87849: variable 'ansible_pipelining' from source: unknown 41016 1727204178.87851: variable 'ansible_timeout' from source: unknown 41016 1727204178.87859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204178.88263: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204178.88280: variable 'omit' from source: magic vars 41016 1727204178.88292: starting attempt loop 41016 1727204178.88300: running the handler 41016 1727204178.88321: _low_level_execute_command(): starting 41016 1727204178.88333: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204178.89660: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204178.89837: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204178.89918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204178.89947: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204178.89974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204178.90106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204178.91957: stdout chunk (state=3): >>>/root <<< 41016 1727204178.92159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204178.92163: stdout chunk (state=3): >>><<< 41016 1727204178.92165: stderr chunk (state=3): >>><<< 41016 1727204178.92168: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204178.92184: _low_level_execute_command(): starting 41016 1727204178.92194: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204178.9213998-41305-203616006487918 `" && echo ansible-tmp-1727204178.9213998-41305-203616006487918="` echo /root/.ansible/tmp/ansible-tmp-1727204178.9213998-41305-203616006487918 `" ) && sleep 0' 41016 1727204178.93567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204178.93586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204178.93607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204178.93766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204178.93835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204178.93987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204178.96272: stdout chunk (state=3): >>>ansible-tmp-1727204178.9213998-41305-203616006487918=/root/.ansible/tmp/ansible-tmp-1727204178.9213998-41305-203616006487918 <<< 41016 1727204178.96341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204178.96610: stderr chunk (state=3): >>><<< 41016 1727204178.96614: stdout chunk (state=3): >>><<< 41016 1727204178.96616: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204178.9213998-41305-203616006487918=/root/.ansible/tmp/ansible-tmp-1727204178.9213998-41305-203616006487918 , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204178.96619: variable 'ansible_module_compression' from source: unknown 41016 1727204178.96665: ANSIBALLZ: Using lock for stat 41016 1727204178.96722: ANSIBALLZ: Acquiring lock 41016 1727204178.96730: ANSIBALLZ: Lock acquired: 140580610774880 41016 1727204178.96827: ANSIBALLZ: Creating module 41016 1727204179.12994: ANSIBALLZ: Writing module into payload 41016 1727204179.13102: ANSIBALLZ: Writing module 41016 1727204179.13130: ANSIBALLZ: Renaming module 41016 1727204179.13141: ANSIBALLZ: Done creating module 41016 1727204179.13160: variable 'ansible_facts' from source: unknown 41016 1727204179.13249: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204178.9213998-41305-203616006487918/AnsiballZ_stat.py 41016 1727204179.13467: Sending initial data 41016 1727204179.13480: Sent initial data (153 bytes) 41016 1727204179.14140: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204179.14170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204179.14173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204179.14246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41016 1727204179.16678: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server 
supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204179.16751: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204179.16833: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmp79iepge3 /root/.ansible/tmp/ansible-tmp-1727204178.9213998-41305-203616006487918/AnsiballZ_stat.py <<< 41016 1727204179.16837: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204178.9213998-41305-203616006487918/AnsiballZ_stat.py" <<< 41016 1727204179.16929: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmp79iepge3" to remote "/root/.ansible/tmp/ansible-tmp-1727204178.9213998-41305-203616006487918/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204178.9213998-41305-203616006487918/AnsiballZ_stat.py" <<< 41016 1727204179.17792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204179.17830: stderr chunk (state=3): >>><<< 41016 1727204179.17834: stdout chunk (state=3): >>><<< 41016 1727204179.17859: done transferring module to remote 41016 1727204179.17872: _low_level_execute_command(): starting 41016 1727204179.17877: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204178.9213998-41305-203616006487918/ /root/.ansible/tmp/ansible-tmp-1727204178.9213998-41305-203616006487918/AnsiballZ_stat.py && sleep 0' 41016 1727204179.18322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204179.18325: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204179.18329: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 41016 1727204179.18332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204179.18370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204179.18374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204179.18460: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 4 <<< 41016 1727204179.21094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204179.21119: stderr chunk (state=3): >>><<< 41016 1727204179.21122: stdout chunk (state=3): >>><<< 41016 1727204179.21136: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41016 1727204179.21140: _low_level_execute_command(): starting 41016 1727204179.21144: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204178.9213998-41305-203616006487918/AnsiballZ_stat.py && sleep 0' 41016 1727204179.21552: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204179.21584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204179.21587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204179.21590: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204179.21592: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204179.21594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204179.21641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204179.21647: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204179.21737: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41016 1727204179.25117: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 41016 1727204179.25201: stdout chunk 
(state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 41016 1727204179.25252: stdout chunk (state=3): >>>import '_io' # <<< 41016 1727204179.25261: stdout chunk (state=3): >>>import 'marshal' # <<< 41016 1727204179.25335: stdout chunk (state=3): >>>import 'posix' # <<< 41016 1727204179.25374: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 41016 1727204179.25395: stdout chunk (state=3): >>># installing zipimport hook <<< 41016 1727204179.25527: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204179.25564: stdout chunk (state=3): >>>import '_codecs' # <<< 41016 1727204179.25594: stdout chunk (state=3): >>>import 'codecs' # <<< 41016 1727204179.25647: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 41016 1727204179.25677: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 41016 1727204179.25705: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4dbc4d0> <<< 41016 1727204179.25722: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4d8bb00> <<< 41016 1727204179.25753: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 41016 1727204179.25759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 41016 1727204179.25803: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4dbea50> import '_signal' # <<< 41016 1727204179.25834: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 41016 1727204179.25857: stdout chunk (state=3): >>>import 'io' # <<< 41016 1727204179.25892: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 41016 1727204179.26115: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages <<< 41016 1727204179.26126: stdout chunk (state=3): >>>Processing global site-packages <<< 41016 1727204179.26146: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 41016 1727204179.26195: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 41016 1727204179.26279: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4dcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 41016 1727204179.26297: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4dce060> <<< 41016 1727204179.26336: stdout chunk (state=3): >>>import 'site' # <<< 41016 1727204179.26373: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 41016 1727204179.27000: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4babf50> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 41016 1727204179.27003: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4bc00e0> <<< 41016 1727204179.27056: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 41016 1727204179.27082: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 41016 1727204179.27148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204179.27207: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4be3950> <<< 41016 1727204179.27241: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4be3fe0> <<< 41016 1727204179.27339: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4bc3bf0> <<< 41016 1727204179.27343: stdout chunk (state=3): >>>import '_functools' # <<< 41016 1727204179.27372: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4bc1340> <<< 41016 1727204179.27530: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4ba9100> <<< 41016 1727204179.27541: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 41016 1727204179.27574: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 41016 1727204179.27587: stdout chunk (state=3): >>>import '_sre' # <<< 41016 1727204179.27651: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 41016 1727204179.27671: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 41016 1727204179.27714: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c078f0> <<< 41016 1727204179.27771: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c06510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4bc21e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c04c50> <<< 41016 1727204179.27849: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 41016 1727204179.27888: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c34920> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4ba8380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 41016 1727204179.27934: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b4c34dd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c34c80> <<< 41016 1727204179.28024: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b4c35070> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4ba6ea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204179.28043: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 41016 1727204179.28096: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 41016 1727204179.28111: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c35730> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c35430> import 'importlib.machinery' # <<< 41016 1727204179.28170: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c36630> <<< 41016 1727204179.28204: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 41016 1727204179.28263: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 41016 1727204179.28307: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 41016 1727204179.28329: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c50860> import 'errno' # <<< 41016 1727204179.28345: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.28511: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b4c51fa0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c52e40> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b4c53470> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c52390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 41016 1727204179.28522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 41016 1727204179.28567: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.28581: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b4c53ef0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c53620> <<< 41016 1727204179.28651: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fd5b4c36690> <<< 41016 1727204179.28667: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 41016 1727204179.28700: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 41016 1727204179.28719: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 41016 1727204179.28746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 41016 1727204179.28793: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.28814: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b49cfda0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 41016 1727204179.28848: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 41016 1727204179.28866: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b49f8890> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b49f85f0> <<< 41016 1727204179.28932: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.28943: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b49f88c0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 41016 1727204179.29031: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.29221: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b49f91f0> <<< 41016 1727204179.29439: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b49f9be0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b49f8aa0> <<< 41016 1727204179.29486: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b49cdf40> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 41016 1727204179.29504: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 41016 1727204179.29542: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 41016 1727204179.29577: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b49faf90> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b49f9a60> <<< 41016 1727204179.29602: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c36d80> <<< 41016 1727204179.29637: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 41016 1727204179.29818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 41016 1727204179.29909: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4a232f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 41016 1727204179.30112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4a476e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 41016 1727204179.30180: stdout chunk (state=3): >>>import 'ntpath' # <<< 41016 1727204179.30220: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4aa84d0> <<< 41016 1727204179.30274: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 41016 1727204179.30280: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 41016 1727204179.30401: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 41016 1727204179.30490: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4aaac30> <<< 41016 1727204179.30600: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4aa85f0> <<< 41016 1727204179.30644: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4a714c0> <<< 41016 
1727204179.30703: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4325550> <<< 41016 1727204179.30726: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4a464e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b49fbec0> <<< 41016 1727204179.30888: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 41016 1727204179.30992: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd5b4a46ae0> <<< 41016 1727204179.31135: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_hbx12uac/ansible_stat_payload.zip' <<< 41016 1727204179.31303: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.31396: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.31407: stdout chunk (state=3): >>> <<< 41016 1727204179.31439: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py<<< 41016 1727204179.31457: stdout chunk (state=3): >>> <<< 41016 1727204179.31491: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 41016 1727204179.31551: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py<<< 41016 1727204179.31562: stdout chunk (state=3): >>> <<< 41016 1727204179.31677: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 41016 1727204179.31747: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 41016 1727204179.31750: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc'<<< 41016 1727204179.31765: stdout chunk (state=3): >>> import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b43772c0> <<< 41016 1727204179.31790: stdout chunk (state=3): >>>import '_typing' # <<< 41016 1727204179.32059: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b435a1b0><<< 41016 1727204179.32078: stdout chunk (state=3): >>> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4359310> <<< 41016 1727204179.32151: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # <<< 41016 1727204179.32168: stdout chunk (state=3): >>> # zipimport: zlib available <<< 41016 1727204179.32193: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.32225: stdout chunk (state=3): >>> # zipimport: zlib available<<< 41016 1727204179.32264: stdout chunk (state=3): >>> <<< 41016 1727204179.32290: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 41016 1727204179.32293: stdout chunk (state=3): >>> # zipimport: zlib available <<< 41016 1727204179.34603: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.36589: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches 
/usr/lib64/python3.12/__future__.py <<< 41016 1727204179.36607: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4375160> <<< 41016 1727204179.36636: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc'<<< 41016 1727204179.36681: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 41016 1727204179.36706: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 41016 1727204179.36736: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc'<<< 41016 1727204179.36786: stdout chunk (state=3): >>> # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.36813: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.36822: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b43a2c90> <<< 41016 1727204179.36922: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b43a2a20> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b43a2330> <<< 41016 1727204179.36956: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 41016 1727204179.36983: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc'<<< 41016 1727204179.37037: stdout chunk (state=3): >>> <<< 41016 1727204179.37061: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b43a2d50> <<< 41016 1727204179.37064: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4377ce0><<< 41016 1727204179.37092: stdout chunk (state=3): >>> import 'atexit' # <<< 41016 1727204179.37130: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 41016 1727204179.37152: stdout chunk (state=3): >>> # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b43a38f0><<< 41016 1727204179.37190: stdout chunk (state=3): >>> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so'<<< 41016 1727204179.37210: stdout chunk (state=3): >>> # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so'<<< 41016 1727204179.37217: stdout chunk (state=3): >>> <<< 41016 1727204179.37249: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b43a3b00> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches 
/usr/lib64/python3.12/locale.py<<< 41016 1727204179.37253: stdout chunk (state=3): >>> <<< 41016 1727204179.37340: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc'<<< 41016 1727204179.37370: stdout chunk (state=3): >>> import '_locale' # <<< 41016 1727204179.37441: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b43a3f20><<< 41016 1727204179.37464: stdout chunk (state=3): >>> import 'pwd' # <<< 41016 1727204179.37469: stdout chunk (state=3): >>> <<< 41016 1727204179.37542: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc'<<< 41016 1727204179.37547: stdout chunk (state=3): >>> <<< 41016 1727204179.37611: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b420dcd0> <<< 41016 1727204179.37646: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 41016 1727204179.37666: stdout chunk (state=3): >>> # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 41016 1727204179.37710: stdout chunk (state=3): >>> import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b420f8f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py<<< 41016 1727204179.37713: stdout chunk (state=3): >>> <<< 41016 1727204179.37742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc'<<< 41016 1727204179.37746: stdout chunk (state=3): >>> <<< 41016 1727204179.37830: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b42142f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py<<< 41016 1727204179.37835: stdout chunk (state=3): >>> <<< 41016 1727204179.37888: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc'<<< 41016 1727204179.37925: stdout chunk (state=3): >>> import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4215460> <<< 41016 1727204179.38017: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc'<<< 41016 1727204179.38019: stdout chunk (state=3): >>> <<< 41016 1727204179.38049: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py<<< 41016 1727204179.38054: stdout chunk (state=3): >>> <<< 41016 1727204179.38072: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 41016 1727204179.38161: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4217f50><<< 41016 1727204179.38166: stdout chunk (state=3): >>> <<< 41016 1727204179.38222: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 41016 1727204179.38236: stdout chunk (state=3): >>> # extension module '_posixsubprocess' executed from 
'/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b421c0b0><<< 41016 1727204179.38273: stdout chunk (state=3): >>> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4216240><<< 41016 1727204179.38281: stdout chunk (state=3): >>> <<< 41016 1727204179.38307: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py<<< 41016 1727204179.38363: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 41016 1727204179.38399: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 41016 1727204179.38421: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 41016 1727204179.38453: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py<<< 41016 1727204179.38513: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 41016 1727204179.38517: stdout chunk (state=3): >>> <<< 41016 1727204179.38544: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py<<< 41016 1727204179.38552: stdout chunk (state=3): >>> <<< 41016 1727204179.38580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 41016 1727204179.38618: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b421fe90><<< 41016 1727204179.38623: stdout chunk (state=3): >>> import '_tokenize' # <<< 41016 1727204179.38745: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b421e960><<< 41016 1727204179.38792: stdout chunk (state=3): >>> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b421e6c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 41016 1727204179.38798: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc'<<< 41016 1727204179.38912: stdout chunk (state=3): >>> <<< 41016 1727204179.38931: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b421ec30> <<< 41016 1727204179.38979: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4216750> <<< 41016 1727204179.39035: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.39064: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.39086: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b4263ef0> <<< 41016 1727204179.39123: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204179.39144: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b42640e0> <<< 41016 1727204179.39203: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 41016 1727204179.39240: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 41016 1727204179.39320: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 41016 1727204179.39345: stdout chunk (state=3): >>> # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b4265c40> <<< 41016 1727204179.39379: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4265a00><<< 41016 1727204179.39382: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 41016 1727204179.39703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b42681d0> <<< 41016 1727204179.39740: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4266330> <<< 41016 1727204179.39751: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 41016 1727204179.39845: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py<<< 41016 1727204179.39853: stdout chunk (state=3): >>> <<< 41016 1727204179.39868: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc'<<< 41016 1727204179.39891: stdout chunk (state=3): >>> import '_string' # <<< 41016 1727204179.40166: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b426b8c0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b42682c0> <<< 41016 1727204179.40274: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.40298: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b426c620> <<< 41016 1727204179.40357: stdout chunk 
(state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.40387: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.40389: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b426c890> <<< 41016 1727204179.40467: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.40470: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.40492: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b426cb90> <<< 41016 1727204179.40524: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4264350> <<< 41016 1727204179.40561: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 41016 1727204179.40584: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 41016 1727204179.40591: stdout chunk (state=3): >>> <<< 41016 1727204179.40625: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 41016 1727204179.40672: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 41016 1727204179.40727: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.40772: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b42f8320><<< 41016 1727204179.40993: stdout chunk (state=3): >>> <<< 41016 1727204179.41045: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.41081: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.41084: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b42f99a0> <<< 41016 1727204179.41117: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b426eab0><<< 41016 1727204179.41122: stdout chunk (state=3): >>> <<< 41016 1727204179.41168: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.41191: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 41016 1727204179.41202: stdout chunk (state=3): >>> import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b426fe60> <<< 41016 1727204179.41221: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b426e6f0> <<< 41016 1727204179.41253: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.41283: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.41306: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 41016 1727204179.41342: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.41490: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.41637: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.41643: stdout chunk (state=3): >>> <<< 41016 1727204179.41683: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.41694: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 41016 1727204179.41745: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 41016 1727204179.41774: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 41016 1727204179.41787: stdout chunk (state=3): >>> # zipimport: zlib available <<< 41016 1727204179.41981: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.42098: stdout chunk (state=3): >>> <<< 41016 1727204179.42178: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.43154: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.43157: stdout chunk (state=3): >>> <<< 41016 1727204179.44131: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 41016 1727204179.44139: stdout chunk (state=3): >>> <<< 41016 1727204179.44157: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 41016 1727204179.44173: stdout chunk (state=3): >>> <<< 41016 1727204179.44189: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 41016 1727204179.44205: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text.converters' # <<< 41016 1727204179.44214: stdout chunk (state=3): >>> <<< 41016 1727204179.44245: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py<<< 41016 1727204179.44291: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 41016 1727204179.44370: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 41016 1727204179.44498: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b42fdb80> <<< 41016 1727204179.44525: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 41016 1727204179.44543: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc'<<< 41016 1727204179.44546: stdout chunk (state=3): >>> <<< 41016 1727204179.44579: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b42fe900><<< 41016 1727204179.44585: stdout chunk (state=3): >>> <<< 41016 1727204179.44613: 
stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4d67da0><<< 41016 1727204179.44616: stdout chunk (state=3): >>> <<< 41016 1727204179.44696: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 41016 1727204179.44703: stdout chunk (state=3): >>> <<< 41016 1727204179.44729: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.44732: stdout chunk (state=3): >>> <<< 41016 1727204179.44768: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.44778: stdout chunk (state=3): >>> <<< 41016 1727204179.44805: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 41016 1727204179.45001: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.45090: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.45099: stdout chunk (state=3): >>> <<< 41016 1727204179.45366: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 41016 1727204179.45418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b42fe600> # zipimport: zlib available <<< 41016 1727204179.46192: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.46198: stdout chunk (state=3): >>> <<< 41016 1727204179.47013: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.47016: stdout chunk (state=3): >>> <<< 41016 1727204179.47149: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.47155: stdout chunk (state=3): >>> <<< 41016 1727204179.47285: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 41016 1727204179.47293: stdout chunk (state=3): >>> <<< 41016 1727204179.47322: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.47388: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.47392: stdout chunk (state=3): >>> <<< 41016 1727204179.47434: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 41016 1727204179.47461: stdout chunk (state=3): >>> # zipimport: zlib available <<< 41016 1727204179.47696: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # <<< 41016 1727204179.47728: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.47754: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.47772: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 41016 1727204179.47801: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.47807: stdout chunk (state=3): >>> <<< 41016 1727204179.47869: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.47876: stdout chunk (state=3): >>> <<< 41016 1727204179.47927: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 41016 1727204179.47941: stdout chunk (state=3): >>> <<< 41016 1727204179.47958: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.48373: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.48382: stdout chunk (state=3): >>> <<< 41016 1727204179.48791: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py<<< 41016 1727204179.48912: stdout chunk (state=3): >>> # code 
object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 41016 1727204179.48948: stdout chunk (state=3): >>>import '_ast' # <<< 41016 1727204179.49068: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b42ffbf0> <<< 41016 1727204179.49102: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.49241: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.49366: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 41016 1727204179.49386: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 41016 1727204179.49404: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 41016 1727204179.49439: stdout chunk (state=3): >>> import 'ansible.module_utils.common.arg_spec' # <<< 41016 1727204179.49447: stdout chunk (state=3): >>> <<< 41016 1727204179.49479: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.49558: stdout chunk (state=3): >>> # zipimport: zlib available <<< 41016 1727204179.49623: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 41016 1727204179.49649: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.49726: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.49802: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.49994: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.49997: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 41016 1727204179.50083: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'<<< 41016 1727204179.50089: stdout chunk (state=3): >>> <<< 41016 1727204179.50246: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 41016 1727204179.50260: stdout chunk (state=3): >>> # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 41016 1727204179.50273: stdout chunk (state=3): >>> import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b410a5d0> <<< 41016 1727204179.50346: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4106f90><<< 41016 1727204179.50349: stdout chunk (state=3): >>> <<< 41016 1727204179.50387: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 41016 1727204179.50403: stdout chunk (state=3): >>> import 'ansible.module_utils.common.process' # <<< 41016 1727204179.50431: stdout chunk (state=3): >>> # zipimport: zlib available<<< 41016 1727204179.50437: stdout chunk (state=3): >>> <<< 41016 1727204179.50542: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.50594: stdout chunk (state=3): >>> <<< 41016 1727204179.50659: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.50662: stdout chunk (state=3): >>> <<< 41016 1727204179.50706: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.50712: stdout chunk (state=3): >>> <<< 41016 1727204179.50781: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py<<< 41016 1727204179.50787: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc'<<< 41016 1727204179.50828: stdout chunk (state=3): >>> # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py<<< 41016 1727204179.50832: stdout chunk (state=3): >>> <<< 41016 1727204179.50874: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'<<< 41016 1727204179.50923: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 41016 1727204179.51030: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 41016 1727204179.51091: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 41016 1727204179.51194: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b43ded50><<< 41016 1727204179.51260: stdout chunk (state=3): >>> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b43eea20><<< 41016 1727204179.51265: stdout chunk (state=3): >>> <<< 41016 1727204179.51379: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b410a390><<< 41016 1727204179.51396: stdout chunk (state=3): >>> <<< 41016 1727204179.51403: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b42fc470><<< 41016 1727204179.51430: stdout chunk (state=3): >>> <<< 41016 1727204179.51433: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 41016 1727204179.51436: stdout chunk (state=3): >>> <<< 41016 1727204179.51518: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 41016 1727204179.51520: stdout chunk (state=3): >>> <<< 41016 1727204179.51557: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 41016 1727204179.51568: stdout chunk (state=3): >>> <<< 41016 1727204179.51583: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 41016 1727204179.51588: stdout chunk (state=3): >>> <<< 41016 1727204179.51669: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 41016 1727204179.51699: stdout chunk (state=3): >>> # zipimport: zlib available <<< 41016 1727204179.51729: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.51736: stdout chunk (state=3): >>> import 'ansible.modules' # <<< 41016 1727204179.51767: stdout chunk (state=3): >>> # zipimport: zlib available <<< 41016 1727204179.51996: stdout chunk (state=3): >>># zipimport: zlib available<<< 41016 1727204179.52091: stdout chunk (state=3): >>> <<< 41016 1727204179.52334: stdout chunk (state=3): >>># zipimport: zlib available <<< 41016 1727204179.52510: stdout chunk (state=3): >>> <<< 41016 1727204179.52520: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, 
"get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 41016 1727204179.52583: stdout chunk (state=3): >>># destroy __main__ <<< 41016 1727204179.53189: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2<<< 41016 1727204179.53202: stdout chunk (state=3): >>> # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path<<< 41016 1727204179.53219: stdout chunk (state=3): >>> # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport<<< 41016 1727204179.53334: stdout chunk (state=3): >>> # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing 
threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string<<< 41016 1727204179.53351: stdout chunk (state=3): >>> # destroy string<<< 41016 1727204179.53367: stdout chunk (state=3): >>> # cleanup[2] removing logging # cleanup[2] removing systemd._journal<<< 41016 1727204179.53373: stdout chunk (state=3): >>> # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128<<< 41016 1727204179.53399: stdout chunk (state=3): >>> # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket<<< 41016 1727204179.53427: stdout chunk (state=3): >>> # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text <<< 41016 1727204179.53436: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes<<< 41016 1727204179.53448: stdout chunk (state=3): >>> # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing 
ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings<<< 41016 1727204179.53470: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters<<< 41016 1727204179.53496: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4<<< 41016 1727204179.53509: stdout chunk (state=3): >>> # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file <<< 41016 1727204179.53528: stdout chunk (state=3): >>># destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro<<< 41016 1727204179.53596: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 41016 1727204179.54082: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 41016 1727204179.54116: stdout chunk (state=3): >>># destroy importlib.machinery <<< 41016 1727204179.54128: stdout chunk (state=3): >>># destroy importlib._abc <<< 41016 1727204179.54131: stdout chunk (state=3): >>># destroy importlib.util <<< 41016 1727204179.54155: stdout chunk (state=3): >>># destroy _bz2<<< 41016 1727204179.54183: stdout chunk (state=3): >>> # destroy _compression # destroy _lzma<<< 41016 1727204179.54190: stdout chunk (state=3): >>> # destroy _blake2<<< 41016 1727204179.54208: stdout chunk (state=3): >>> # destroy binascii<<< 41016 1727204179.54214: stdout chunk (state=3): >>> # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path<<< 41016 1727204179.54241: stdout chunk (state=3): >>> # destroy zipfile <<< 41016 1727204179.54254: stdout chunk (state=3): >>># destroy pathlib # destroy zipfile._path.glob<<< 41016 1727204179.54310: stdout chunk (state=3): >>> # destroy fnmatch # destroy ipaddress # destroy ntpath<<< 41016 1727204179.54316: stdout chunk (state=3): >>> <<< 41016 1727204179.54336: stdout chunk (state=3): >>># destroy importlib<<< 41016 1727204179.54345: stdout chunk (state=3): >>> # destroy zipimport<<< 41016 1727204179.54356: stdout chunk (state=3): >>> <<< 41016 1727204179.54377: stdout chunk (state=3): >>># 
destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder<<< 41016 1727204179.54396: stdout chunk (state=3): >>> # destroy json.encoder # destroy json.scanner<<< 41016 1727204179.54406: stdout chunk (state=3): >>> # destroy _json<<< 41016 1727204179.54422: stdout chunk (state=3): >>> # destroy grp<<< 41016 1727204179.54425: stdout chunk (state=3): >>> # destroy encodings # destroy _locale<<< 41016 1727204179.54443: stdout chunk (state=3): >>> <<< 41016 1727204179.54454: stdout chunk (state=3): >>># destroy pwd # destroy locale<<< 41016 1727204179.54480: stdout chunk (state=3): >>> # destroy signal # destroy fcntl # destroy select # destroy _signal<<< 41016 1727204179.54489: stdout chunk (state=3): >>> # destroy _posixsubprocess<<< 41016 1727204179.54513: stdout chunk (state=3): >>> # destroy syslog<<< 41016 1727204179.54549: stdout chunk (state=3): >>> # destroy uuid # destroy selectors<<< 41016 1727204179.54561: stdout chunk (state=3): >>> # destroy errno<<< 41016 1727204179.54565: stdout chunk (state=3): >>> <<< 41016 1727204179.54590: stdout chunk (state=3): >>># destroy array # destroy datetime<<< 41016 1727204179.54636: stdout chunk (state=3): >>> # destroy selinux <<< 41016 1727204179.54639: stdout chunk (state=3): >>># destroy shutil <<< 41016 1727204179.54668: stdout chunk (state=3): >>># destroy distro<<< 41016 1727204179.54678: stdout chunk (state=3): >>> # destroy distro.distro # destroy argparse # destroy json # destroy logging<<< 41016 1727204179.54775: stdout chunk (state=3): >>> # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux <<< 41016 1727204179.54804: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian <<< 41016 1727204179.54840: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes <<< 41016 1727204179.54845: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc<<< 41016 1727204179.54851: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves <<< 41016 1727204179.54870: stdout chunk (state=3): >>># cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 41016 1727204179.54877: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128<<< 41016 1727204179.54895: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal<<< 41016 1727204179.54911: stdout chunk (state=3): >>> # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache<<< 41016 1727204179.54928: stdout chunk (state=3): >>> # destroy textwrap # cleanup[3] wiping tokenize<<< 41016 1727204179.54932: stdout chunk (state=3): >>> # cleanup[3] wiping _tokenize<<< 41016 1727204179.54961: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit <<< 41016 1727204179.54980: stdout chunk (state=3): >>># cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib<<< 41016 1727204179.54995: stdout chunk (state=3): >>> # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib<<< 41016 1727204179.55010: stdout chunk (state=3): >>> # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings<<< 41016 1727204179.55014: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap_external<<< 41016 
1727204179.55033: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap<<< 41016 1727204179.55037: stdout chunk (state=3): >>> # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix<<< 41016 1727204179.55064: stdout chunk (state=3): >>> # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg<<< 41016 1727204179.55074: stdout chunk (state=3): >>> # cleanup[3] wiping re._parser # cleanup[3] wiping _sre<<< 41016 1727204179.55097: stdout chunk (state=3): >>> # cleanup[3] wiping functools # cleanup[3] wiping _functools<<< 41016 1727204179.55156: stdout chunk (state=3): >>> # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types<<< 41016 1727204179.55187: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs<<< 41016 1727204179.55237: stdout chunk (state=3): >>> # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys<<< 41016 1727204179.55246: stdout chunk (state=3): >>> # cleanup[3] wiping builtins <<< 41016 1727204179.55271: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime<<< 41016 1727204179.55680: stdout chunk (state=3): >>> <<< 41016 1727204179.55736: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg<<< 41016 1727204179.55747: stdout chunk (state=3): >>> # destroy contextlib<<< 41016 1727204179.55788: stdout chunk (state=3): >>> # destroy _typing <<< 41016 1727204179.55829: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator<<< 41016 1727204179.55852: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal<<< 41016 1727204179.55892: stdout chunk (state=3): >>> # clear sys.meta_path <<< 41016 1727204179.55925: stdout chunk (state=3): >>># clear sys.modules <<< 41016 1727204179.55994: stdout chunk (state=3): >>># destroy _frozen_importlib <<< 41016 1727204179.56063: stdout chunk (state=3): >>># destroy codecs <<< 41016 1727204179.56105: stdout chunk (state=3): >>># destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig <<< 41016 1727204179.56151: stdout chunk (state=3): >>># 
destroy encodings.cp437 # destroy _codecs <<< 41016 1727204179.56174: stdout chunk (state=3): >>># destroy io # destroy traceback<<< 41016 1727204179.56199: stdout chunk (state=3): >>> # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 41016 1727204179.56234: stdout chunk (state=3): >>># destroy _random <<< 41016 1727204179.56266: stdout chunk (state=3): >>># destroy _weakref <<< 41016 1727204179.56296: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _string<<< 41016 1727204179.56330: stdout chunk (state=3): >>> # destroy re # destroy itertools <<< 41016 1727204179.56371: stdout chunk (state=3): >>># destroy _abc<<< 41016 1727204179.56389: stdout chunk (state=3): >>> # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread<<< 41016 1727204179.56413: stdout chunk (state=3): >>> # clear sys.audit hooks <<< 41016 1727204179.57080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204179.57084: stdout chunk (state=3): >>><<< 41016 1727204179.57086: stderr chunk (state=3): >>><<< 41016 1727204179.57100: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4dbc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4d8bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4dbea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4dcd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4dce060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4babf50> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4bc00e0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4be3950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4be3fe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4bc3bf0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4bc1340> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4ba9100> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c078f0> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c06510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4bc21e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c04c50> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c34920> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4ba8380> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b4c34dd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c34c80> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b4c35070> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4ba6ea0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c35730> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c35430> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c36630> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c50860> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b4c51fa0> 
# /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c52e40> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b4c53470> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c52390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b4c53ef0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c53620> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c36690> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b49cfda0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b49f8890> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b49f85f0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b49f88c0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import 
'_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b49f91f0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b49f9be0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b49f8aa0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b49cdf40> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b49faf90> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b49f9a60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4c36d80> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4a232f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4a476e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4aa84d0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4aaac30> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4aa85f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4a714c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from 
'/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4325550> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4a464e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b49fbec0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd5b4a46ae0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_hbx12uac/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b43772c0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b435a1b0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4359310> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4375160> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b43a2c90> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b43a2a20> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b43a2330> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b43a2d50> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4377ce0> import 'atexit' # # extension module 'grp' loaded 
from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b43a38f0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b43a3b00> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b43a3f20> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b420dcd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b420f8f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b42142f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4215460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4217f50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b421c0b0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4216240> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b421fe90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b421e960> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b421e6c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b421ec30> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4216750> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b4263ef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b42640e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b4265c40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4265a00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b42681d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4266330> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b426b8c0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b42682c0> # extension module 'systemd._journal' 
loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b426c620> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b426c890> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b426cb90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4264350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b42f8320> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b42f99a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b426eab0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b426fe60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b426e6f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b42fdb80> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b42fe900> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4d67da0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b42fe600> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b42ffbf0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd5b410a5d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b4106f90> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b43ded50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b43eea20> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b410a390> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd5b42fc470> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # 
cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing 
socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy 
systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy 
ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
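The module's stdout captured above opens with the JSON result {"changed": false, "stat": {"exists": false}} for /run/ostree-booted (the play appears to use the file's absence to decide that the host is not booted via ostree) and is then followed by the interpreter's shutdown chatter; that trailing text is what triggers the "junk after the JSON data" warning reported just below. As a rough, illustrative sketch only (not Ansible's actual result parser), the leading JSON object can be separated from such mixed output like this:

    import json

    def extract_leading_json(raw_output: str):
        """Return (result, junk): the first JSON object found in the module
        output plus whatever trailing text follows it, e.g. '# destroy __main__ ...'."""
        start = raw_output.find("{")
        if start == -1:
            raise ValueError("no JSON object found in module output")
        result, end = json.JSONDecoder().raw_decode(raw_output, start)
        return result, raw_output[end:].strip()

    # Shaped like the stat result captured in this log (junk truncated here):
    raw = '{"changed": false, "stat": {"exists": false}} # destroy __main__ # clear sys.path_importer_cache'
    result, junk = extract_leading_json(raw)
    print(result["stat"]["exists"])  # False -> /run/ostree-booted is absent
    print(bool(junk))                # True  -> this is what the warning reports

Ansible still parses the result correctly in this run; the warning is informational, and the task goes on to report ok: [managed-node1] further down.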
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache [... identical to the interpreter-cleanup output shown above ...] # destroy builtins # destroy _thread # clear sys.audit hooks 41016 1727204179.57844: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204178.9213998-41305-203616006487918/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204179.57848: _low_level_execute_command(): starting 41016 1727204179.57850: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r
/root/.ansible/tmp/ansible-tmp-1727204178.9213998-41305-203616006487918/ > /dev/null 2>&1 && sleep 0' 41016 1727204179.58226: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204179.58229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204179.58231: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204179.58234: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204179.58235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204179.58237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 41016 1727204179.58239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204179.58311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204179.58314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204179.58316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204179.58445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41016 1727204179.61443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204179.61447: stdout chunk (state=3): >>><<< 41016 1727204179.61482: stderr chunk (state=3): >>><<< 41016 1727204179.61485: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41016 1727204179.61488: handler run complete 41016 1727204179.61690: attempt loop complete, returning result 41016 1727204179.61694: _execute() done 41016 1727204179.61696: dumping result to json 41016 1727204179.61698: done dumping result, returning 41016 
1727204179.61700: done running TaskExecutor() for managed-node1/TASK: Check if system is ostree [028d2410-947f-12d5-0ec4-0000000000d2] 41016 1727204179.61702: sending task result for task 028d2410-947f-12d5-0ec4-0000000000d2 ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 41016 1727204179.61831: no more pending results, returning what we have 41016 1727204179.61835: results queue empty 41016 1727204179.61836: checking for any_errors_fatal 41016 1727204179.61843: done checking for any_errors_fatal 41016 1727204179.61844: checking for max_fail_percentage 41016 1727204179.61846: done checking for max_fail_percentage 41016 1727204179.61847: checking to see if all hosts have failed and the running result is not ok 41016 1727204179.61848: done checking to see if all hosts have failed 41016 1727204179.61848: getting the remaining hosts for this loop 41016 1727204179.61850: done getting the remaining hosts for this loop 41016 1727204179.61969: getting the next task for host managed-node1 41016 1727204179.61979: done getting next task for host managed-node1 41016 1727204179.61982: ^ task is: TASK: Set flag to indicate system is ostree 41016 1727204179.61985: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204179.61989: getting variables 41016 1727204179.61991: in VariableManager get_vars() 41016 1727204179.62025: Calling all_inventory to load vars for managed-node1 41016 1727204179.62029: Calling groups_inventory to load vars for managed-node1 41016 1727204179.62033: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204179.62046: Calling all_plugins_play to load vars for managed-node1 41016 1727204179.62049: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204179.62054: Calling groups_plugins_play to load vars for managed-node1 41016 1727204179.62582: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000d2 41016 1727204179.62586: WORKER PROCESS EXITING 41016 1727204179.62746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204179.63078: done with get_vars() 41016 1727204179.63089: done getting variables 41016 1727204179.63181: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.813) 0:00:03.308 ***** 41016 1727204179.63208: entering _queue_task() for managed-node1/set_fact 41016 1727204179.63210: Creating lock for set_fact 41016 1727204179.63485: worker is 1 (out of 1 available) 41016 1727204179.63495: exiting 
_queue_task() for managed-node1/set_fact 41016 1727204179.63506: done queuing things up, now waiting for results queue to drain 41016 1727204179.63507: waiting for pending results... 41016 1727204179.63711: running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree 41016 1727204179.63815: in run() - task 028d2410-947f-12d5-0ec4-0000000000d3 41016 1727204179.63833: variable 'ansible_search_path' from source: unknown 41016 1727204179.63839: variable 'ansible_search_path' from source: unknown 41016 1727204179.63878: calling self._execute() 41016 1727204179.63961: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204179.63974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204179.63991: variable 'omit' from source: magic vars 41016 1727204179.64517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204179.64755: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204179.64807: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204179.64843: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204179.64884: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204179.64967: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204179.65001: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204179.65031: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204179.65059: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204179.65183: Evaluated conditional (not __network_is_ostree is defined): True 41016 1727204179.65197: variable 'omit' from source: magic vars 41016 1727204179.65234: variable 'omit' from source: magic vars 41016 1727204179.65413: variable '__ostree_booted_stat' from source: set_fact 41016 1727204179.65416: variable 'omit' from source: magic vars 41016 1727204179.65433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204179.65462: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204179.65485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204179.65512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204179.65592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204179.65624: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204179.65631: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 
1727204179.65680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204179.65771: Set connection var ansible_shell_executable to /bin/sh 41016 1727204179.65791: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204179.65802: Set connection var ansible_shell_type to sh 41016 1727204179.65811: Set connection var ansible_timeout to 10 41016 1727204179.65820: Set connection var ansible_pipelining to False 41016 1727204179.65830: Set connection var ansible_connection to ssh 41016 1727204179.65855: variable 'ansible_shell_executable' from source: unknown 41016 1727204179.65882: variable 'ansible_connection' from source: unknown 41016 1727204179.65884: variable 'ansible_module_compression' from source: unknown 41016 1727204179.65887: variable 'ansible_shell_type' from source: unknown 41016 1727204179.65981: variable 'ansible_shell_executable' from source: unknown 41016 1727204179.65985: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204179.65987: variable 'ansible_pipelining' from source: unknown 41016 1727204179.65989: variable 'ansible_timeout' from source: unknown 41016 1727204179.65991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204179.66026: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204179.66042: variable 'omit' from source: magic vars 41016 1727204179.66053: starting attempt loop 41016 1727204179.66061: running the handler 41016 1727204179.66076: handler run complete 41016 1727204179.66090: attempt loop complete, returning result 41016 1727204179.66096: _execute() done 41016 1727204179.66106: dumping result to json 41016 1727204179.66113: done dumping result, returning 41016 1727204179.66123: done running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree [028d2410-947f-12d5-0ec4-0000000000d3] 41016 1727204179.66130: sending task result for task 028d2410-947f-12d5-0ec4-0000000000d3 ok: [managed-node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 41016 1727204179.66363: no more pending results, returning what we have 41016 1727204179.66367: results queue empty 41016 1727204179.66367: checking for any_errors_fatal 41016 1727204179.66373: done checking for any_errors_fatal 41016 1727204179.66374: checking for max_fail_percentage 41016 1727204179.66378: done checking for max_fail_percentage 41016 1727204179.66482: checking to see if all hosts have failed and the running result is not ok 41016 1727204179.66484: done checking to see if all hosts have failed 41016 1727204179.66484: getting the remaining hosts for this loop 41016 1727204179.66486: done getting the remaining hosts for this loop 41016 1727204179.66490: getting the next task for host managed-node1 41016 1727204179.66498: done getting next task for host managed-node1 41016 1727204179.66501: ^ task is: TASK: Fix CentOS6 Base repo 41016 1727204179.66504: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204179.66507: getting variables 41016 1727204179.66508: in VariableManager get_vars() 41016 1727204179.66533: Calling all_inventory to load vars for managed-node1 41016 1727204179.66536: Calling groups_inventory to load vars for managed-node1 41016 1727204179.66539: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204179.66547: Calling all_plugins_play to load vars for managed-node1 41016 1727204179.66550: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204179.66553: Calling groups_plugins_play to load vars for managed-node1 41016 1727204179.66804: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000d3 41016 1727204179.66812: WORKER PROCESS EXITING 41016 1727204179.66834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204179.67032: done with get_vars() 41016 1727204179.67041: done getting variables 41016 1727204179.67155: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.039) 0:00:03.347 ***** 41016 1727204179.67191: entering _queue_task() for managed-node1/copy 41016 1727204179.67451: worker is 1 (out of 1 available) 41016 1727204179.67462: exiting _queue_task() for managed-node1/copy 41016 1727204179.67473: done queuing things up, now waiting for results queue to drain 41016 1727204179.67474: waiting for pending results... 
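The two results above come from consecutive tasks in el_repo_setup.yml. A minimal sketch of what they plausibly look like, reconstructed only from what this log shows (the stat of /run/ostree-booted, the registered variable __ostree_booted_stat, and the conditional "not __network_is_ostree is defined"); the exact YAML in the test file may differ:

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined

On this node the stat reports exists: false, so __network_is_ostree ends up false and the EPEL setup include that follows is allowed to run.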
41016 1727204179.67721: running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo 41016 1727204179.67822: in run() - task 028d2410-947f-12d5-0ec4-0000000000d5 41016 1727204179.67840: variable 'ansible_search_path' from source: unknown 41016 1727204179.67846: variable 'ansible_search_path' from source: unknown 41016 1727204179.67885: calling self._execute() 41016 1727204179.67966: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204179.67983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204179.68080: variable 'omit' from source: magic vars 41016 1727204179.68474: variable 'ansible_distribution' from source: facts 41016 1727204179.68501: Evaluated conditional (ansible_distribution == 'CentOS'): True 41016 1727204179.68628: variable 'ansible_distribution_major_version' from source: facts 41016 1727204179.68643: Evaluated conditional (ansible_distribution_major_version == '6'): False 41016 1727204179.68650: when evaluation is False, skipping this task 41016 1727204179.68658: _execute() done 41016 1727204179.68665: dumping result to json 41016 1727204179.68672: done dumping result, returning 41016 1727204179.68683: done running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo [028d2410-947f-12d5-0ec4-0000000000d5] 41016 1727204179.68691: sending task result for task 028d2410-947f-12d5-0ec4-0000000000d5 41016 1727204179.68926: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000d5 41016 1727204179.68929: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 41016 1727204179.68989: no more pending results, returning what we have 41016 1727204179.68993: results queue empty 41016 1727204179.68993: checking for any_errors_fatal 41016 1727204179.68998: done checking for any_errors_fatal 41016 1727204179.68999: checking for max_fail_percentage 41016 1727204179.69001: done checking for max_fail_percentage 41016 1727204179.69001: checking to see if all hosts have failed and the running result is not ok 41016 1727204179.69002: done checking to see if all hosts have failed 41016 1727204179.69003: getting the remaining hosts for this loop 41016 1727204179.69004: done getting the remaining hosts for this loop 41016 1727204179.69008: getting the next task for host managed-node1 41016 1727204179.69017: done getting next task for host managed-node1 41016 1727204179.69020: ^ task is: TASK: Include the task 'enable_epel.yml' 41016 1727204179.69022: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204179.69025: getting variables 41016 1727204179.69027: in VariableManager get_vars() 41016 1727204179.69054: Calling all_inventory to load vars for managed-node1 41016 1727204179.69057: Calling groups_inventory to load vars for managed-node1 41016 1727204179.69060: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204179.69071: Calling all_plugins_play to load vars for managed-node1 41016 1727204179.69073: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204179.69078: Calling groups_plugins_play to load vars for managed-node1 41016 1727204179.69346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204179.69715: done with get_vars() 41016 1727204179.69724: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.026) 0:00:03.374 ***** 41016 1727204179.69811: entering _queue_task() for managed-node1/include_tasks 41016 1727204179.70050: worker is 1 (out of 1 available) 41016 1727204179.70059: exiting _queue_task() for managed-node1/include_tasks 41016 1727204179.70069: done queuing things up, now waiting for results queue to drain 41016 1727204179.70070: waiting for pending results... 41016 1727204179.70307: running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' 41016 1727204179.70411: in run() - task 028d2410-947f-12d5-0ec4-0000000000d6 41016 1727204179.70430: variable 'ansible_search_path' from source: unknown 41016 1727204179.70436: variable 'ansible_search_path' from source: unknown 41016 1727204179.70480: calling self._execute() 41016 1727204179.70547: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204179.70680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204179.70685: variable 'omit' from source: magic vars 41016 1727204179.71019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204179.73269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204179.73345: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204179.73388: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204179.73437: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204179.73467: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204179.73557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204179.73593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204179.73628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 41016 1727204179.73672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204179.73693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204179.73815: variable '__network_is_ostree' from source: set_fact 41016 1727204179.73836: Evaluated conditional (not __network_is_ostree | d(false)): True 41016 1727204179.73881: _execute() done 41016 1727204179.73884: dumping result to json 41016 1727204179.73886: done dumping result, returning 41016 1727204179.73887: done running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' [028d2410-947f-12d5-0ec4-0000000000d6] 41016 1727204179.73889: sending task result for task 028d2410-947f-12d5-0ec4-0000000000d6 41016 1727204179.74147: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000d6 41016 1727204179.74150: WORKER PROCESS EXITING 41016 1727204179.74174: no more pending results, returning what we have 41016 1727204179.74182: in VariableManager get_vars() 41016 1727204179.74217: Calling all_inventory to load vars for managed-node1 41016 1727204179.74220: Calling groups_inventory to load vars for managed-node1 41016 1727204179.74224: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204179.74233: Calling all_plugins_play to load vars for managed-node1 41016 1727204179.74236: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204179.74238: Calling groups_plugins_play to load vars for managed-node1 41016 1727204179.74499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204179.74684: done with get_vars() 41016 1727204179.74692: variable 'ansible_search_path' from source: unknown 41016 1727204179.74693: variable 'ansible_search_path' from source: unknown 41016 1727204179.74730: we have included files to process 41016 1727204179.74732: generating all_blocks data 41016 1727204179.74733: done generating all_blocks data 41016 1727204179.74738: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 41016 1727204179.74740: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 41016 1727204179.74742: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 41016 1727204179.75407: done processing included file 41016 1727204179.75412: iterating over new_blocks loaded from include file 41016 1727204179.75413: in VariableManager get_vars() 41016 1727204179.75424: done with get_vars() 41016 1727204179.75426: filtering new block on tags 41016 1727204179.75446: done filtering new block on tags 41016 1727204179.75449: in VariableManager get_vars() 41016 1727204179.75460: done with get_vars() 41016 1727204179.75461: filtering new block on tags 41016 1727204179.75473: done filtering new block on tags 41016 1727204179.75477: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node1 41016 1727204179.75482: extending task lists for all hosts with included blocks 
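The include evaluated above (task path el_repo_setup.yml:51) gates the EPEL setup on the fact set a moment earlier. A sketch of that include; the task name and the when expression are taken directly from the log, while the exact file reference is assumed:

- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)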
41016 1727204179.75572: done extending task lists 41016 1727204179.75574: done processing included files 41016 1727204179.75575: results queue empty 41016 1727204179.75577: checking for any_errors_fatal 41016 1727204179.75580: done checking for any_errors_fatal 41016 1727204179.75580: checking for max_fail_percentage 41016 1727204179.75581: done checking for max_fail_percentage 41016 1727204179.75582: checking to see if all hosts have failed and the running result is not ok 41016 1727204179.75583: done checking to see if all hosts have failed 41016 1727204179.75584: getting the remaining hosts for this loop 41016 1727204179.75585: done getting the remaining hosts for this loop 41016 1727204179.75587: getting the next task for host managed-node1 41016 1727204179.75590: done getting next task for host managed-node1 41016 1727204179.75592: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 41016 1727204179.75594: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204179.75596: getting variables 41016 1727204179.75597: in VariableManager get_vars() 41016 1727204179.75604: Calling all_inventory to load vars for managed-node1 41016 1727204179.75606: Calling groups_inventory to load vars for managed-node1 41016 1727204179.75610: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204179.75616: Calling all_plugins_play to load vars for managed-node1 41016 1727204179.75623: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204179.75626: Calling groups_plugins_play to load vars for managed-node1 41016 1727204179.75773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204179.75951: done with get_vars() 41016 1727204179.75960: done getting variables 41016 1727204179.76026: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 41016 1727204179.76221: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.064) 0:00:03.438 ***** 41016 1727204179.76263: entering _queue_task() for managed-node1/command 41016 1727204179.76265: Creating lock for command 41016 1727204179.76558: worker is 1 (out of 1 available) 41016 1727204179.76568: exiting _queue_task() for managed-node1/command 41016 1727204179.76581: done queuing things up, now waiting for results queue to drain 41016 1727204179.76582: waiting for pending results... 
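The task header printed above is templated from ansible_distribution_major_version ("Create EPEL {{ ansible_distribution_major_version }}" rendered as "Create EPEL 10"), and the evaluation just below shows it gated by a two-part conditional and skipped on this node. A sketch of that shape, with a hypothetical stand-in for the command body, since a skipped task never logs its module arguments:

- name: Create EPEL {{ ansible_distribution_major_version }}
  command: echo "EPEL release package would be installed here"  # hypothetical stand-in; the real command is not visible in this log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']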
41016 1727204179.76834: running TaskExecutor() for managed-node1/TASK: Create EPEL 10 41016 1727204179.76950: in run() - task 028d2410-947f-12d5-0ec4-0000000000f0 41016 1727204179.77181: variable 'ansible_search_path' from source: unknown 41016 1727204179.77184: variable 'ansible_search_path' from source: unknown 41016 1727204179.77187: calling self._execute() 41016 1727204179.77189: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204179.77190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204179.77192: variable 'omit' from source: magic vars 41016 1727204179.77464: variable 'ansible_distribution' from source: facts 41016 1727204179.77484: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 41016 1727204179.77687: variable 'ansible_distribution_major_version' from source: facts 41016 1727204179.77698: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 41016 1727204179.77707: when evaluation is False, skipping this task 41016 1727204179.77719: _execute() done 41016 1727204179.77726: dumping result to json 41016 1727204179.77733: done dumping result, returning 41016 1727204179.77961: done running TaskExecutor() for managed-node1/TASK: Create EPEL 10 [028d2410-947f-12d5-0ec4-0000000000f0] 41016 1727204179.77964: sending task result for task 028d2410-947f-12d5-0ec4-0000000000f0 41016 1727204179.78043: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000f0 41016 1727204179.78046: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 41016 1727204179.78123: no more pending results, returning what we have 41016 1727204179.78127: results queue empty 41016 1727204179.78128: checking for any_errors_fatal 41016 1727204179.78129: done checking for any_errors_fatal 41016 1727204179.78130: checking for max_fail_percentage 41016 1727204179.78132: done checking for max_fail_percentage 41016 1727204179.78132: checking to see if all hosts have failed and the running result is not ok 41016 1727204179.78133: done checking to see if all hosts have failed 41016 1727204179.78134: getting the remaining hosts for this loop 41016 1727204179.78136: done getting the remaining hosts for this loop 41016 1727204179.78140: getting the next task for host managed-node1 41016 1727204179.78147: done getting next task for host managed-node1 41016 1727204179.78150: ^ task is: TASK: Install yum-utils package 41016 1727204179.78154: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204179.78158: getting variables 41016 1727204179.78160: in VariableManager get_vars() 41016 1727204179.78192: Calling all_inventory to load vars for managed-node1 41016 1727204179.78195: Calling groups_inventory to load vars for managed-node1 41016 1727204179.78200: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204179.78215: Calling all_plugins_play to load vars for managed-node1 41016 1727204179.78219: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204179.78222: Calling groups_plugins_play to load vars for managed-node1 41016 1727204179.78800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204179.79228: done with get_vars() 41016 1727204179.79238: done getting variables 41016 1727204179.79554: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.033) 0:00:03.471 ***** 41016 1727204179.79584: entering _queue_task() for managed-node1/package 41016 1727204179.79586: Creating lock for package 41016 1727204179.80285: worker is 1 (out of 1 available) 41016 1727204179.80296: exiting _queue_task() for managed-node1/package 41016 1727204179.80307: done queuing things up, now waiting for results queue to drain 41016 1727204179.80310: waiting for pending results... 
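The yum-utils installation queued above sits behind the same distribution gate, as its evaluation below confirms. In this sketch the package name and state are assumptions inferred from the task name, because the skipped task's arguments never appear in the log:

- name: Install yum-utils package
  package:
    name: yum-utils
    state: present
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']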
41016 1727204179.80526: running TaskExecutor() for managed-node1/TASK: Install yum-utils package 41016 1727204179.80874: in run() - task 028d2410-947f-12d5-0ec4-0000000000f1 41016 1727204179.80887: variable 'ansible_search_path' from source: unknown 41016 1727204179.80890: variable 'ansible_search_path' from source: unknown 41016 1727204179.80930: calling self._execute() 41016 1727204179.81315: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204179.81319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204179.81329: variable 'omit' from source: magic vars 41016 1727204179.81984: variable 'ansible_distribution' from source: facts 41016 1727204179.82112: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 41016 1727204179.82357: variable 'ansible_distribution_major_version' from source: facts 41016 1727204179.82364: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 41016 1727204179.82367: when evaluation is False, skipping this task 41016 1727204179.82370: _execute() done 41016 1727204179.82373: dumping result to json 41016 1727204179.82379: done dumping result, returning 41016 1727204179.82386: done running TaskExecutor() for managed-node1/TASK: Install yum-utils package [028d2410-947f-12d5-0ec4-0000000000f1] 41016 1727204179.82391: sending task result for task 028d2410-947f-12d5-0ec4-0000000000f1 41016 1727204179.82761: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000f1 41016 1727204179.82765: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 41016 1727204179.82869: no more pending results, returning what we have 41016 1727204179.82873: results queue empty 41016 1727204179.82874: checking for any_errors_fatal 41016 1727204179.82879: done checking for any_errors_fatal 41016 1727204179.82880: checking for max_fail_percentage 41016 1727204179.82882: done checking for max_fail_percentage 41016 1727204179.82882: checking to see if all hosts have failed and the running result is not ok 41016 1727204179.82883: done checking to see if all hosts have failed 41016 1727204179.82884: getting the remaining hosts for this loop 41016 1727204179.82885: done getting the remaining hosts for this loop 41016 1727204179.82890: getting the next task for host managed-node1 41016 1727204179.82896: done getting next task for host managed-node1 41016 1727204179.82899: ^ task is: TASK: Enable EPEL 7 41016 1727204179.82902: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204179.82905: getting variables 41016 1727204179.82907: in VariableManager get_vars() 41016 1727204179.82935: Calling all_inventory to load vars for managed-node1 41016 1727204179.82938: Calling groups_inventory to load vars for managed-node1 41016 1727204179.82942: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204179.82954: Calling all_plugins_play to load vars for managed-node1 41016 1727204179.82956: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204179.82959: Calling groups_plugins_play to load vars for managed-node1 41016 1727204179.83164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204179.83786: done with get_vars() 41016 1727204179.83798: done getting variables 41016 1727204179.83857: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.043) 0:00:03.514 ***** 41016 1727204179.83888: entering _queue_task() for managed-node1/command 41016 1727204179.84520: worker is 1 (out of 1 available) 41016 1727204179.84530: exiting _queue_task() for managed-node1/command 41016 1727204179.84543: done queuing things up, now waiting for results queue to drain 41016 1727204179.84544: waiting for pending results... 41016 1727204179.85098: running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 41016 1727204179.85189: in run() - task 028d2410-947f-12d5-0ec4-0000000000f2 41016 1727204179.85254: variable 'ansible_search_path' from source: unknown 41016 1727204179.85281: variable 'ansible_search_path' from source: unknown 41016 1727204179.85359: calling self._execute() 41016 1727204179.85500: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204179.85990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204179.85993: variable 'omit' from source: magic vars 41016 1727204179.86538: variable 'ansible_distribution' from source: facts 41016 1727204179.86555: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 41016 1727204179.86764: variable 'ansible_distribution_major_version' from source: facts 41016 1727204179.86835: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 41016 1727204179.86844: when evaluation is False, skipping this task 41016 1727204179.86887: _execute() done 41016 1727204179.86895: dumping result to json 41016 1727204179.86903: done dumping result, returning 41016 1727204179.86914: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 [028d2410-947f-12d5-0ec4-0000000000f2] 41016 1727204179.86923: sending task result for task 028d2410-947f-12d5-0ec4-0000000000f2 41016 1727204179.87220: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000f2 41016 1727204179.87225: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 41016 1727204179.87272: no more pending results, returning what we 
have 41016 1727204179.87278: results queue empty 41016 1727204179.87278: checking for any_errors_fatal 41016 1727204179.87284: done checking for any_errors_fatal 41016 1727204179.87285: checking for max_fail_percentage 41016 1727204179.87286: done checking for max_fail_percentage 41016 1727204179.87287: checking to see if all hosts have failed and the running result is not ok 41016 1727204179.87288: done checking to see if all hosts have failed 41016 1727204179.87289: getting the remaining hosts for this loop 41016 1727204179.87290: done getting the remaining hosts for this loop 41016 1727204179.87294: getting the next task for host managed-node1 41016 1727204179.87301: done getting next task for host managed-node1 41016 1727204179.87303: ^ task is: TASK: Enable EPEL 8 41016 1727204179.87307: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204179.87313: getting variables 41016 1727204179.87315: in VariableManager get_vars() 41016 1727204179.87345: Calling all_inventory to load vars for managed-node1 41016 1727204179.87348: Calling groups_inventory to load vars for managed-node1 41016 1727204179.87351: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204179.87362: Calling all_plugins_play to load vars for managed-node1 41016 1727204179.87364: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204179.87367: Calling groups_plugins_play to load vars for managed-node1 41016 1727204179.87850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204179.88386: done with get_vars() 41016 1727204179.88396: done getting variables 41016 1727204179.88454: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.047) 0:00:03.562 ***** 41016 1727204179.88686: entering _queue_task() for managed-node1/command 41016 1727204179.89069: worker is 1 (out of 1 available) 41016 1727204179.89186: exiting _queue_task() for managed-node1/command 41016 1727204179.89198: done queuing things up, now waiting for results queue to drain 41016 1727204179.89200: waiting for pending results... 
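Both "Enable EPEL 7" (just skipped above) and "Enable EPEL 8" (queued above) sit behind the same major-version gate, and both are skipped because this node reports major version 10, as the rendered "Create EPEL 10" task name earlier indicates. A hypothetical one-off debug task, not part of enable_epel.yml, that would print the two facts driving these skips:

- name: Show the facts gating the EPEL tasks  # hypothetical debugging aid, not in the test files
  debug:
    msg: "{{ ansible_distribution }} {{ ansible_distribution_major_version }}"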
41016 1727204179.89489: running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 41016 1727204179.89759: in run() - task 028d2410-947f-12d5-0ec4-0000000000f3 41016 1727204179.89778: variable 'ansible_search_path' from source: unknown 41016 1727204179.89786: variable 'ansible_search_path' from source: unknown 41016 1727204179.89824: calling self._execute() 41016 1727204179.89938: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204179.90181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204179.90185: variable 'omit' from source: magic vars 41016 1727204179.90567: variable 'ansible_distribution' from source: facts 41016 1727204179.90592: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 41016 1727204179.90730: variable 'ansible_distribution_major_version' from source: facts 41016 1727204179.90742: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 41016 1727204179.90750: when evaluation is False, skipping this task 41016 1727204179.90756: _execute() done 41016 1727204179.90842: dumping result to json 41016 1727204179.91102: done dumping result, returning 41016 1727204179.91105: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 [028d2410-947f-12d5-0ec4-0000000000f3] 41016 1727204179.91107: sending task result for task 028d2410-947f-12d5-0ec4-0000000000f3 41016 1727204179.91166: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000f3 41016 1727204179.91170: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 41016 1727204179.91220: no more pending results, returning what we have 41016 1727204179.91224: results queue empty 41016 1727204179.91225: checking for any_errors_fatal 41016 1727204179.91230: done checking for any_errors_fatal 41016 1727204179.91230: checking for max_fail_percentage 41016 1727204179.91232: done checking for max_fail_percentage 41016 1727204179.91232: checking to see if all hosts have failed and the running result is not ok 41016 1727204179.91233: done checking to see if all hosts have failed 41016 1727204179.91234: getting the remaining hosts for this loop 41016 1727204179.91235: done getting the remaining hosts for this loop 41016 1727204179.91239: getting the next task for host managed-node1 41016 1727204179.91248: done getting next task for host managed-node1 41016 1727204179.91252: ^ task is: TASK: Enable EPEL 6 41016 1727204179.91255: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204179.91259: getting variables 41016 1727204179.91260: in VariableManager get_vars() 41016 1727204179.91292: Calling all_inventory to load vars for managed-node1 41016 1727204179.91294: Calling groups_inventory to load vars for managed-node1 41016 1727204179.91297: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204179.91307: Calling all_plugins_play to load vars for managed-node1 41016 1727204179.91312: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204179.91314: Calling groups_plugins_play to load vars for managed-node1 41016 1727204179.91610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204179.91999: done with get_vars() 41016 1727204179.92012: done getting variables 41016 1727204179.92278: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.036) 0:00:03.599 ***** 41016 1727204179.92314: entering _queue_task() for managed-node1/copy 41016 1727204179.92896: worker is 1 (out of 1 available) 41016 1727204179.92912: exiting _queue_task() for managed-node1/copy 41016 1727204179.92922: done queuing things up, now waiting for results queue to drain 41016 1727204179.92924: waiting for pending results... 41016 1727204179.93295: running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 41016 1727204179.93371: in run() - task 028d2410-947f-12d5-0ec4-0000000000f5 41016 1727204179.93392: variable 'ansible_search_path' from source: unknown 41016 1727204179.93399: variable 'ansible_search_path' from source: unknown 41016 1727204179.93482: calling self._execute() 41016 1727204179.93538: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204179.93551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204179.93564: variable 'omit' from source: magic vars 41016 1727204179.94282: variable 'ansible_distribution' from source: facts 41016 1727204179.94286: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 41016 1727204179.94288: variable 'ansible_distribution_major_version' from source: facts 41016 1727204179.94290: Evaluated conditional (ansible_distribution_major_version == '6'): False 41016 1727204179.94292: when evaluation is False, skipping this task 41016 1727204179.94294: _execute() done 41016 1727204179.94296: dumping result to json 41016 1727204179.94298: done dumping result, returning 41016 1727204179.94302: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 [028d2410-947f-12d5-0ec4-0000000000f5] 41016 1727204179.94304: sending task result for task 028d2410-947f-12d5-0ec4-0000000000f5 41016 1727204179.94881: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000f5 41016 1727204179.94884: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 41016 1727204179.94921: no more pending results, returning what we have 41016 
1727204179.94924: results queue empty 41016 1727204179.94925: checking for any_errors_fatal 41016 1727204179.94929: done checking for any_errors_fatal 41016 1727204179.94930: checking for max_fail_percentage 41016 1727204179.94931: done checking for max_fail_percentage 41016 1727204179.94932: checking to see if all hosts have failed and the running result is not ok 41016 1727204179.94933: done checking to see if all hosts have failed 41016 1727204179.94934: getting the remaining hosts for this loop 41016 1727204179.94935: done getting the remaining hosts for this loop 41016 1727204179.94938: getting the next task for host managed-node1 41016 1727204179.94945: done getting next task for host managed-node1 41016 1727204179.94948: ^ task is: TASK: Set network provider to 'nm' 41016 1727204179.94950: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204179.94953: getting variables 41016 1727204179.94954: in VariableManager get_vars() 41016 1727204179.94979: Calling all_inventory to load vars for managed-node1 41016 1727204179.94981: Calling groups_inventory to load vars for managed-node1 41016 1727204179.94984: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204179.94992: Calling all_plugins_play to load vars for managed-node1 41016 1727204179.94995: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204179.94997: Calling groups_plugins_play to load vars for managed-node1 41016 1727204179.95624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204179.95856: done with get_vars() 41016 1727204179.95865: done getting variables 41016 1727204179.95928: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:13 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.036) 0:00:03.635 ***** 41016 1727204179.95952: entering _queue_task() for managed-node1/set_fact 41016 1727204179.96343: worker is 1 (out of 1 available) 41016 1727204179.96355: exiting _queue_task() for managed-node1/set_fact 41016 1727204179.96366: done queuing things up, now waiting for results queue to drain 41016 1727204179.96368: waiting for pending results... 
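The provider selection queued above comes from the test playbook itself (tests_route_device_nm.yml:13) rather than the shared repo setup. A minimal sketch consistent with the fact it sets, with the exact layout assumed:

- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm

The ok result a few lines below confirms the fact lands as network_provider: "nm".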
41016 1727204179.96606: running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' 41016 1727204179.96700: in run() - task 028d2410-947f-12d5-0ec4-000000000007 41016 1727204179.96724: variable 'ansible_search_path' from source: unknown 41016 1727204179.96764: calling self._execute() 41016 1727204179.97061: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204179.97183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204179.97187: variable 'omit' from source: magic vars 41016 1727204179.97378: variable 'omit' from source: magic vars 41016 1727204179.97441: variable 'omit' from source: magic vars 41016 1727204179.97489: variable 'omit' from source: magic vars 41016 1727204179.97537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204179.97774: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204179.97804: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204179.97980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204179.97983: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204179.97986: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204179.97989: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204179.97991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204179.97993: Set connection var ansible_shell_executable to /bin/sh 41016 1727204179.98006: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204179.98019: Set connection var ansible_shell_type to sh 41016 1727204179.98029: Set connection var ansible_timeout to 10 41016 1727204179.98039: Set connection var ansible_pipelining to False 41016 1727204179.98050: Set connection var ansible_connection to ssh 41016 1727204179.98075: variable 'ansible_shell_executable' from source: unknown 41016 1727204179.98086: variable 'ansible_connection' from source: unknown 41016 1727204179.98116: variable 'ansible_module_compression' from source: unknown 41016 1727204179.98125: variable 'ansible_shell_type' from source: unknown 41016 1727204179.98133: variable 'ansible_shell_executable' from source: unknown 41016 1727204179.98151: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204179.98161: variable 'ansible_pipelining' from source: unknown 41016 1727204179.98167: variable 'ansible_timeout' from source: unknown 41016 1727204179.98223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204179.98369: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204179.98418: variable 'omit' from source: magic vars 41016 1727204179.98435: starting attempt loop 41016 1727204179.98443: running the handler 41016 1727204179.98459: handler run complete 41016 1727204179.98544: attempt loop complete, returning result 41016 1727204179.98547: _execute() done 41016 1727204179.98550: 
dumping result to json 41016 1727204179.98552: done dumping result, returning 41016 1727204179.98554: done running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' [028d2410-947f-12d5-0ec4-000000000007] 41016 1727204179.98557: sending task result for task 028d2410-947f-12d5-0ec4-000000000007 41016 1727204179.98624: done sending task result for task 028d2410-947f-12d5-0ec4-000000000007 41016 1727204179.98627: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 41016 1727204179.98705: no more pending results, returning what we have 41016 1727204179.98711: results queue empty 41016 1727204179.98712: checking for any_errors_fatal 41016 1727204179.98717: done checking for any_errors_fatal 41016 1727204179.98718: checking for max_fail_percentage 41016 1727204179.98720: done checking for max_fail_percentage 41016 1727204179.98721: checking to see if all hosts have failed and the running result is not ok 41016 1727204179.98722: done checking to see if all hosts have failed 41016 1727204179.98723: getting the remaining hosts for this loop 41016 1727204179.98724: done getting the remaining hosts for this loop 41016 1727204179.98728: getting the next task for host managed-node1 41016 1727204179.98735: done getting next task for host managed-node1 41016 1727204179.98737: ^ task is: TASK: meta (flush_handlers) 41016 1727204179.98738: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204179.98742: getting variables 41016 1727204179.98743: in VariableManager get_vars() 41016 1727204179.98769: Calling all_inventory to load vars for managed-node1 41016 1727204179.98772: Calling groups_inventory to load vars for managed-node1 41016 1727204179.98777: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204179.98788: Calling all_plugins_play to load vars for managed-node1 41016 1727204179.98791: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204179.98794: Calling groups_plugins_play to load vars for managed-node1 41016 1727204179.99346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204179.99616: done with get_vars() 41016 1727204179.99626: done getting variables 41016 1727204179.99691: in VariableManager get_vars() 41016 1727204179.99702: Calling all_inventory to load vars for managed-node1 41016 1727204179.99704: Calling groups_inventory to load vars for managed-node1 41016 1727204179.99707: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204179.99714: Calling all_plugins_play to load vars for managed-node1 41016 1727204179.99717: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204179.99719: Calling groups_plugins_play to load vars for managed-node1 41016 1727204179.99853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204180.00321: done with get_vars() 41016 1727204180.00336: done queuing things up, now waiting for results queue to drain 41016 1727204180.00338: results queue empty 41016 1727204180.00339: checking for any_errors_fatal 41016 1727204180.00341: done checking for any_errors_fatal 41016 1727204180.00342: checking for 
max_fail_percentage 41016 1727204180.00343: done checking for max_fail_percentage 41016 1727204180.00343: checking to see if all hosts have failed and the running result is not ok 41016 1727204180.00344: done checking to see if all hosts have failed 41016 1727204180.00345: getting the remaining hosts for this loop 41016 1727204180.00346: done getting the remaining hosts for this loop 41016 1727204180.00348: getting the next task for host managed-node1 41016 1727204180.00352: done getting next task for host managed-node1 41016 1727204180.00353: ^ task is: TASK: meta (flush_handlers) 41016 1727204180.00354: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204180.00361: getting variables 41016 1727204180.00361: in VariableManager get_vars() 41016 1727204180.00369: Calling all_inventory to load vars for managed-node1 41016 1727204180.00371: Calling groups_inventory to load vars for managed-node1 41016 1727204180.00373: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204180.00380: Calling all_plugins_play to load vars for managed-node1 41016 1727204180.00383: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204180.00385: Calling groups_plugins_play to load vars for managed-node1 41016 1727204180.00525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204180.00749: done with get_vars() 41016 1727204180.00757: done getting variables 41016 1727204180.00806: in VariableManager get_vars() 41016 1727204180.00818: Calling all_inventory to load vars for managed-node1 41016 1727204180.00820: Calling groups_inventory to load vars for managed-node1 41016 1727204180.00823: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204180.00827: Calling all_plugins_play to load vars for managed-node1 41016 1727204180.00830: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204180.00833: Calling groups_plugins_play to load vars for managed-node1 41016 1727204180.00967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204180.01217: done with get_vars() 41016 1727204180.01227: done queuing things up, now waiting for results queue to drain 41016 1727204180.01229: results queue empty 41016 1727204180.01230: checking for any_errors_fatal 41016 1727204180.01232: done checking for any_errors_fatal 41016 1727204180.01232: checking for max_fail_percentage 41016 1727204180.01233: done checking for max_fail_percentage 41016 1727204180.01234: checking to see if all hosts have failed and the running result is not ok 41016 1727204180.01234: done checking to see if all hosts have failed 41016 1727204180.01235: getting the remaining hosts for this loop 41016 1727204180.01236: done getting the remaining hosts for this loop 41016 1727204180.01238: getting the next task for host managed-node1 41016 1727204180.01241: done getting next task for host managed-node1 41016 1727204180.01242: ^ task is: None 41016 1727204180.01243: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 41016 1727204180.01244: done queuing things up, now waiting for results queue to drain 41016 1727204180.01245: results queue empty 41016 1727204180.01246: checking for any_errors_fatal 41016 1727204180.01246: done checking for any_errors_fatal 41016 1727204180.01247: checking for max_fail_percentage 41016 1727204180.01248: done checking for max_fail_percentage 41016 1727204180.01248: checking to see if all hosts have failed and the running result is not ok 41016 1727204180.01249: done checking to see if all hosts have failed 41016 1727204180.01251: getting the next task for host managed-node1 41016 1727204180.01254: done getting next task for host managed-node1 41016 1727204180.01254: ^ task is: None 41016 1727204180.01255: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204180.01303: in VariableManager get_vars() 41016 1727204180.01330: done with get_vars() 41016 1727204180.01336: in VariableManager get_vars() 41016 1727204180.01349: done with get_vars() 41016 1727204180.01353: variable 'omit' from source: magic vars 41016 1727204180.01396: in VariableManager get_vars() 41016 1727204180.01415: done with get_vars() 41016 1727204180.01438: variable 'omit' from source: magic vars PLAY [Test output device of routes] ******************************************** 41016 1727204180.02102: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 41016 1727204180.02129: getting the remaining hosts for this loop 41016 1727204180.02130: done getting the remaining hosts for this loop 41016 1727204180.02133: getting the next task for host managed-node1 41016 1727204180.02135: done getting next task for host managed-node1 41016 1727204180.02137: ^ task is: TASK: Gathering Facts 41016 1727204180.02138: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204180.02140: getting variables 41016 1727204180.02141: in VariableManager get_vars() 41016 1727204180.02153: Calling all_inventory to load vars for managed-node1 41016 1727204180.02155: Calling groups_inventory to load vars for managed-node1 41016 1727204180.02157: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204180.02161: Calling all_plugins_play to load vars for managed-node1 41016 1727204180.02174: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204180.02182: Calling groups_plugins_play to load vars for managed-node1 41016 1727204180.02331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204180.02579: done with get_vars() 41016 1727204180.02587: done getting variables 41016 1727204180.02627: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:3 Tuesday 24 September 2024 14:56:20 -0400 (0:00:00.066) 0:00:03.702 ***** 41016 1727204180.02652: entering _queue_task() for managed-node1/gather_facts 41016 1727204180.03250: worker is 1 (out of 1 available) 41016 1727204180.03260: exiting _queue_task() for managed-node1/gather_facts 41016 1727204180.03269: done queuing things up, now waiting for results queue to drain 41016 1727204180.03271: waiting for pending results... 
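For reference, the "ok: [managed-node1]" result above (ansible_facts.network_provider == "nm") and the PLAY/TASK banners that follow correspond to playbook constructs roughly like the sketch below. This is an illustrative reconstruction, not the actual contents of tests_route_device.yml; the host pattern and anything not printed in the log are assumptions.

---
# Hypothetical sketch; only the task name, the network_provider value and the
# second play's name are taken from the log above.
- hosts: all                      # assumed host pattern
  tasks:
    - name: Set network provider to 'nm'
      ansible.builtin.set_fact:
        network_provider: nm      # matches the ok: result shown earlier

- name: Test output device of routes
  hosts: all                      # assumed host pattern
  gather_facts: true              # yields the "Gathering Facts" task that runs next
  tasks: []                       # subsequent tasks (e.g. "Set type and interface0") appear further down the log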
41016 1727204180.03591: running TaskExecutor() for managed-node1/TASK: Gathering Facts 41016 1727204180.03596: in run() - task 028d2410-947f-12d5-0ec4-00000000011b 41016 1727204180.03598: variable 'ansible_search_path' from source: unknown 41016 1727204180.03601: calling self._execute() 41016 1727204180.03682: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204180.03717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204180.03732: variable 'omit' from source: magic vars 41016 1727204180.04180: variable 'ansible_distribution_major_version' from source: facts 41016 1727204180.04198: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204180.04240: variable 'omit' from source: magic vars 41016 1727204180.04271: variable 'omit' from source: magic vars 41016 1727204180.04307: variable 'omit' from source: magic vars 41016 1727204180.04348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204180.04394: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204180.04422: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204180.04702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204180.04710: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204180.04713: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204180.04716: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204180.04718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204180.04720: Set connection var ansible_shell_executable to /bin/sh 41016 1727204180.04722: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204180.04725: Set connection var ansible_shell_type to sh 41016 1727204180.04727: Set connection var ansible_timeout to 10 41016 1727204180.04729: Set connection var ansible_pipelining to False 41016 1727204180.04731: Set connection var ansible_connection to ssh 41016 1727204180.04733: variable 'ansible_shell_executable' from source: unknown 41016 1727204180.04735: variable 'ansible_connection' from source: unknown 41016 1727204180.04738: variable 'ansible_module_compression' from source: unknown 41016 1727204180.04740: variable 'ansible_shell_type' from source: unknown 41016 1727204180.04742: variable 'ansible_shell_executable' from source: unknown 41016 1727204180.04744: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204180.04755: variable 'ansible_pipelining' from source: unknown 41016 1727204180.04762: variable 'ansible_timeout' from source: unknown 41016 1727204180.04770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204180.05122: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204180.05140: variable 'omit' from source: magic vars 41016 1727204180.05193: starting attempt loop 41016 1727204180.05200: running the 
handler 41016 1727204180.05224: variable 'ansible_facts' from source: unknown 41016 1727204180.05284: _low_level_execute_command(): starting 41016 1727204180.05363: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204180.06511: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204180.06616: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204180.06748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204180.06896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204180.07016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41016 1727204180.09699: stdout chunk (state=3): >>>/root <<< 41016 1727204180.09798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204180.09836: stderr chunk (state=3): >>><<< 41016 1727204180.09846: stdout chunk (state=3): >>><<< 41016 1727204180.09996: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41016 1727204180.10000: _low_level_execute_command(): starting 41016 1727204180.10002: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204180.0995016-41401-56676910219003 `" && echo ansible-tmp-1727204180.0995016-41401-56676910219003="` echo 
/root/.ansible/tmp/ansible-tmp-1727204180.0995016-41401-56676910219003 `" ) && sleep 0' 41016 1727204180.10662: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204180.10680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204180.10795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204180.10828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204180.10953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41016 1727204180.14010: stdout chunk (state=3): >>>ansible-tmp-1727204180.0995016-41401-56676910219003=/root/.ansible/tmp/ansible-tmp-1727204180.0995016-41401-56676910219003 <<< 41016 1727204180.14229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204180.14251: stdout chunk (state=3): >>><<< 41016 1727204180.14254: stderr chunk (state=3): >>><<< 41016 1727204180.14271: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204180.0995016-41401-56676910219003=/root/.ansible/tmp/ansible-tmp-1727204180.0995016-41401-56676910219003 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41016 1727204180.14381: variable 'ansible_module_compression' from source: unknown 41016 1727204180.14386: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 41016 1727204180.14441: variable 'ansible_facts' from source: unknown 41016 
1727204180.14673: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204180.0995016-41401-56676910219003/AnsiballZ_setup.py 41016 1727204180.14926: Sending initial data 41016 1727204180.14940: Sent initial data (153 bytes) 41016 1727204180.15462: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204180.15493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204180.15602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204180.15624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204180.15647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204180.15778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41016 1727204180.18293: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204180.18351: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204180.18497: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpjaoyfin5 /root/.ansible/tmp/ansible-tmp-1727204180.0995016-41401-56676910219003/AnsiballZ_setup.py <<< 41016 1727204180.18500: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204180.0995016-41401-56676910219003/AnsiballZ_setup.py" <<< 41016 1727204180.18580: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpjaoyfin5" to remote "/root/.ansible/tmp/ansible-tmp-1727204180.0995016-41401-56676910219003/AnsiballZ_setup.py" <<< 41016 1727204180.18583: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204180.0995016-41401-56676910219003/AnsiballZ_setup.py" <<< 41016 1727204180.20406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204180.20564: stdout chunk (state=3): >>><<< 41016 1727204180.20567: stderr chunk (state=3): >>><<< 41016 1727204180.20569: done transferring module to remote 41016 1727204180.20571: _low_level_execute_command(): starting 41016 1727204180.20574: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204180.0995016-41401-56676910219003/ /root/.ansible/tmp/ansible-tmp-1727204180.0995016-41401-56676910219003/AnsiballZ_setup.py && sleep 0' 41016 1727204180.21177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204180.21192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204180.21207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204180.21331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204180.21357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204180.21493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41016 1727204180.24193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204180.24256: stderr chunk (state=3): >>><<< 41016 1727204180.24265: stdout chunk (state=3): >>><<< 41016 1727204180.24290: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41016 1727204180.24393: _low_level_execute_command(): starting 41016 1727204180.24398: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204180.0995016-41401-56676910219003/AnsiballZ_setup.py && sleep 0' 41016 1727204180.24934: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204180.24949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204180.24965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204180.24987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204180.25004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204180.25095: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204180.25123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204180.25236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41016 1727204181.13741: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, 
"ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "20", "epoch": "1727204180", "epoch_int": "1727204180", "date": "2024-09-24", "time": "14:56:20", "iso8601_micro": "2024-09-24T18:56:20.717143Z", "iso8601": "2024-09-24T18:56:20Z", "iso8601_basic": "20240924T145620717143", "iso<<< 41016 1727204181.13797: stdout chunk (state=3): >>>8601_basic_short": "20240924T145620", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, 
"rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2905, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 626, "free": 2905}, "nocache": {"free": 3264, "used": 267}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 771, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785092096, "block_size": 4096, "block_total": 65519099, "block_available": 63912376, "block_used": 1606723, "inode_total": 131070960, "inode_available": 131027256, "inode_used": 43704, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": 
false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", 
"receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_loadavg": {"1m": 0.5888671875, "5m": 0.53857421875, "15m": 0.29931640625}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 41016 1727204181.16884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204181.16887: stdout chunk (state=3): >>><<< 41016 1727204181.16890: stderr chunk (state=3): >>><<< 41016 1727204181.17090: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", 
"serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "20", "epoch": "1727204180", "epoch_int": "1727204180", "date": "2024-09-24", "time": "14:56:20", "iso8601_micro": "2024-09-24T18:56:20.717143Z", "iso8601": "2024-09-24T18:56:20Z", "iso8601_basic": "20240924T145620717143", "iso8601_basic_short": "20240924T145620", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2905, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 626, "free": 2905}, "nocache": {"free": 3264, "used": 267}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", 
"sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 771, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785092096, "block_size": 4096, "block_total": 65519099, "block_available": 63912376, "block_used": 1606723, "inode_total": 131070960, "inode_available": 131027256, "inode_used": 43704, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 
65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_loadavg": {"1m": 0.5888671875, "5m": 0.53857421875, "15m": 0.29931640625}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], 
"fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 41016 1727204181.17316: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204180.0995016-41401-56676910219003/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204181.17345: _low_level_execute_command(): starting 41016 1727204181.17354: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204180.0995016-41401-56676910219003/ > /dev/null 2>&1 && sleep 0' 41016 1727204181.18005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204181.18024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204181.18079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204181.18152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204181.18178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
41016 1727204181.18203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204181.18327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41016 1727204181.20500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204181.20551: stderr chunk (state=3): >>><<< 41016 1727204181.20561: stdout chunk (state=3): >>><<< 41016 1727204181.20593: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41016 1727204181.20609: handler run complete 41016 1727204181.20748: variable 'ansible_facts' from source: unknown 41016 1727204181.20870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204181.21206: variable 'ansible_facts' from source: unknown 41016 1727204181.21387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204181.21427: attempt loop complete, returning result 41016 1727204181.21437: _execute() done 41016 1727204181.21443: dumping result to json 41016 1727204181.21478: done dumping result, returning 41016 1727204181.21500: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [028d2410-947f-12d5-0ec4-00000000011b] 41016 1727204181.21511: sending task result for task 028d2410-947f-12d5-0ec4-00000000011b ok: [managed-node1] 41016 1727204181.22746: done sending task result for task 028d2410-947f-12d5-0ec4-00000000011b 41016 1727204181.22750: WORKER PROCESS EXITING 41016 1727204181.22853: no more pending results, returning what we have 41016 1727204181.22856: results queue empty 41016 1727204181.22857: checking for any_errors_fatal 41016 1727204181.22859: done checking for any_errors_fatal 41016 1727204181.22859: checking for max_fail_percentage 41016 1727204181.22861: done checking for max_fail_percentage 41016 1727204181.22862: checking to see if all hosts have failed and the running result is not ok 41016 1727204181.22863: done checking to see if all hosts have failed 41016 1727204181.22864: getting the remaining hosts for this loop 41016 1727204181.22865: done getting the remaining hosts for this loop 41016 1727204181.22869: getting the next task for host managed-node1 41016 1727204181.22877: done getting next task for host managed-node1 41016 1727204181.22879: ^ task is: TASK: meta 
(flush_handlers) 41016 1727204181.22881: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204181.22885: getting variables 41016 1727204181.22886: in VariableManager get_vars() 41016 1727204181.22920: Calling all_inventory to load vars for managed-node1 41016 1727204181.22923: Calling groups_inventory to load vars for managed-node1 41016 1727204181.22925: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204181.22935: Calling all_plugins_play to load vars for managed-node1 41016 1727204181.22938: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204181.22941: Calling groups_plugins_play to load vars for managed-node1 41016 1727204181.23469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204181.23792: done with get_vars() 41016 1727204181.23804: done getting variables 41016 1727204181.23895: in VariableManager get_vars() 41016 1727204181.23911: Calling all_inventory to load vars for managed-node1 41016 1727204181.23913: Calling groups_inventory to load vars for managed-node1 41016 1727204181.23916: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204181.23920: Calling all_plugins_play to load vars for managed-node1 41016 1727204181.23923: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204181.23925: Calling groups_plugins_play to load vars for managed-node1 41016 1727204181.24345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204181.24729: done with get_vars() 41016 1727204181.24857: done queuing things up, now waiting for results queue to drain 41016 1727204181.24860: results queue empty 41016 1727204181.24861: checking for any_errors_fatal 41016 1727204181.24864: done checking for any_errors_fatal 41016 1727204181.24865: checking for max_fail_percentage 41016 1727204181.24866: done checking for max_fail_percentage 41016 1727204181.24870: checking to see if all hosts have failed and the running result is not ok 41016 1727204181.24871: done checking to see if all hosts have failed 41016 1727204181.24872: getting the remaining hosts for this loop 41016 1727204181.24873: done getting the remaining hosts for this loop 41016 1727204181.24879: getting the next task for host managed-node1 41016 1727204181.24883: done getting next task for host managed-node1 41016 1727204181.24885: ^ task is: TASK: Set type and interface0 41016 1727204181.24887: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204181.24889: getting variables 41016 1727204181.24890: in VariableManager get_vars() 41016 1727204181.24903: Calling all_inventory to load vars for managed-node1 41016 1727204181.24905: Calling groups_inventory to load vars for managed-node1 41016 1727204181.24907: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204181.24912: Calling all_plugins_play to load vars for managed-node1 41016 1727204181.24915: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204181.24918: Calling groups_plugins_play to load vars for managed-node1 41016 1727204181.25172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204181.25416: done with get_vars() 41016 1727204181.25422: done getting variables 41016 1727204181.25456: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set type and interface0] ************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:11 Tuesday 24 September 2024 14:56:21 -0400 (0:00:01.228) 0:00:04.930 ***** 41016 1727204181.25486: entering _queue_task() for managed-node1/set_fact 41016 1727204181.25716: worker is 1 (out of 1 available) 41016 1727204181.25728: exiting _queue_task() for managed-node1/set_fact 41016 1727204181.25740: done queuing things up, now waiting for results queue to drain 41016 1727204181.25741: waiting for pending results... 41016 1727204181.25890: running TaskExecutor() for managed-node1/TASK: Set type and interface0 41016 1727204181.25945: in run() - task 028d2410-947f-12d5-0ec4-00000000000b 41016 1727204181.25956: variable 'ansible_search_path' from source: unknown 41016 1727204181.25989: calling self._execute() 41016 1727204181.26051: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204181.26056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204181.26063: variable 'omit' from source: magic vars 41016 1727204181.26331: variable 'ansible_distribution_major_version' from source: facts 41016 1727204181.26341: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204181.26346: variable 'omit' from source: magic vars 41016 1727204181.26365: variable 'omit' from source: magic vars 41016 1727204181.26388: variable 'type' from source: play vars 41016 1727204181.26445: variable 'type' from source: play vars 41016 1727204181.26461: variable 'interface0' from source: play vars 41016 1727204181.26514: variable 'interface0' from source: play vars 41016 1727204181.26523: variable 'omit' from source: magic vars 41016 1727204181.26551: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204181.26577: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204181.26592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204181.26604: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204181.26619: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204181.26640: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204181.26643: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204181.26645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204181.26714: Set connection var ansible_shell_executable to /bin/sh 41016 1727204181.26717: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204181.26726: Set connection var ansible_shell_type to sh 41016 1727204181.26728: Set connection var ansible_timeout to 10 41016 1727204181.26730: Set connection var ansible_pipelining to False 41016 1727204181.26738: Set connection var ansible_connection to ssh 41016 1727204181.26757: variable 'ansible_shell_executable' from source: unknown 41016 1727204181.26760: variable 'ansible_connection' from source: unknown 41016 1727204181.26763: variable 'ansible_module_compression' from source: unknown 41016 1727204181.26765: variable 'ansible_shell_type' from source: unknown 41016 1727204181.26767: variable 'ansible_shell_executable' from source: unknown 41016 1727204181.26769: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204181.26771: variable 'ansible_pipelining' from source: unknown 41016 1727204181.26774: variable 'ansible_timeout' from source: unknown 41016 1727204181.26780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204181.26902: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204181.26910: variable 'omit' from source: magic vars 41016 1727204181.26922: starting attempt loop 41016 1727204181.26933: running the handler 41016 1727204181.26957: handler run complete 41016 1727204181.27020: attempt loop complete, returning result 41016 1727204181.27022: _execute() done 41016 1727204181.27025: dumping result to json 41016 1727204181.27027: done dumping result, returning 41016 1727204181.27029: done running TaskExecutor() for managed-node1/TASK: Set type and interface0 [028d2410-947f-12d5-0ec4-00000000000b] 41016 1727204181.27032: sending task result for task 028d2410-947f-12d5-0ec4-00000000000b ok: [managed-node1] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 41016 1727204181.27189: no more pending results, returning what we have 41016 1727204181.27192: results queue empty 41016 1727204181.27193: checking for any_errors_fatal 41016 1727204181.27194: done checking for any_errors_fatal 41016 1727204181.27195: checking for max_fail_percentage 41016 1727204181.27196: done checking for max_fail_percentage 41016 1727204181.27197: checking to see if all hosts have failed and the running result is not ok 41016 1727204181.27198: done checking to see if all hosts have failed 41016 1727204181.27199: getting the remaining hosts for this loop 41016 1727204181.27200: done getting the remaining hosts for this loop 41016 1727204181.27203: getting the next task for host managed-node1 41016 1727204181.27211: done getting next task for host managed-node1 41016 1727204181.27213: ^ task is: TASK: Show interfaces 41016 1727204181.27215: ^ state is: HOST STATE: block=2, task=2, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204181.27218: getting variables 41016 1727204181.27220: in VariableManager get_vars() 41016 1727204181.27254: Calling all_inventory to load vars for managed-node1 41016 1727204181.27257: Calling groups_inventory to load vars for managed-node1 41016 1727204181.27259: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204181.27267: Calling all_plugins_play to load vars for managed-node1 41016 1727204181.27270: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204181.27272: Calling groups_plugins_play to load vars for managed-node1 41016 1727204181.27625: done sending task result for task 028d2410-947f-12d5-0ec4-00000000000b 41016 1727204181.27629: WORKER PROCESS EXITING 41016 1727204181.27790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204181.28207: done with get_vars() 41016 1727204181.28217: done getting variables TASK [Show interfaces] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:15 Tuesday 24 September 2024 14:56:21 -0400 (0:00:00.028) 0:00:04.959 ***** 41016 1727204181.28329: entering _queue_task() for managed-node1/include_tasks 41016 1727204181.28619: worker is 1 (out of 1 available) 41016 1727204181.28633: exiting _queue_task() for managed-node1/include_tasks 41016 1727204181.28646: done queuing things up, now waiting for results queue to drain 41016 1727204181.28647: waiting for pending results... 
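
[editor's note] The "Set type and interface0" result above (ansible_facts: interface=ethtest0, type=veth) comes from a set_fact task at tests_route_device.yml:11. A minimal sketch of what that task likely looks like, assuming only the play vars 'type' and 'interface0' that the variable lookups above reference; the actual file may differ:

# Hypothetical reconstruction of tests_route_device.yml:11, inferred from the
# play vars ('type', 'interface0') and the ansible_facts shown in the result.
- name: Set type and interface0
  ansible.builtin.set_fact:
    type: "{{ type }}"
    interface: "{{ interface0 }}"
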
41016 1727204181.28803: running TaskExecutor() for managed-node1/TASK: Show interfaces 41016 1727204181.28861: in run() - task 028d2410-947f-12d5-0ec4-00000000000c 41016 1727204181.28871: variable 'ansible_search_path' from source: unknown 41016 1727204181.28901: calling self._execute() 41016 1727204181.28966: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204181.28969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204181.28981: variable 'omit' from source: magic vars 41016 1727204181.29244: variable 'ansible_distribution_major_version' from source: facts 41016 1727204181.29254: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204181.29263: _execute() done 41016 1727204181.29269: dumping result to json 41016 1727204181.29272: done dumping result, returning 41016 1727204181.29279: done running TaskExecutor() for managed-node1/TASK: Show interfaces [028d2410-947f-12d5-0ec4-00000000000c] 41016 1727204181.29284: sending task result for task 028d2410-947f-12d5-0ec4-00000000000c 41016 1727204181.29368: done sending task result for task 028d2410-947f-12d5-0ec4-00000000000c 41016 1727204181.29371: WORKER PROCESS EXITING 41016 1727204181.29433: no more pending results, returning what we have 41016 1727204181.29437: in VariableManager get_vars() 41016 1727204181.29471: Calling all_inventory to load vars for managed-node1 41016 1727204181.29473: Calling groups_inventory to load vars for managed-node1 41016 1727204181.29477: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204181.29487: Calling all_plugins_play to load vars for managed-node1 41016 1727204181.29489: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204181.29492: Calling groups_plugins_play to load vars for managed-node1 41016 1727204181.29602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204181.29715: done with get_vars() 41016 1727204181.29722: variable 'ansible_search_path' from source: unknown 41016 1727204181.29732: we have included files to process 41016 1727204181.29733: generating all_blocks data 41016 1727204181.29734: done generating all_blocks data 41016 1727204181.29735: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41016 1727204181.29735: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41016 1727204181.29737: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41016 1727204181.29838: in VariableManager get_vars() 41016 1727204181.29853: done with get_vars() 41016 1727204181.29925: done processing included file 41016 1727204181.29926: iterating over new_blocks loaded from include file 41016 1727204181.29928: in VariableManager get_vars() 41016 1727204181.29939: done with get_vars() 41016 1727204181.29940: filtering new block on tags 41016 1727204181.29952: done filtering new block on tags 41016 1727204181.29954: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 41016 1727204181.29957: extending task lists for all hosts with included blocks 41016 1727204181.30041: done extending task lists 
41016 1727204181.30042: done processing included files 41016 1727204181.30043: results queue empty 41016 1727204181.30043: checking for any_errors_fatal 41016 1727204181.30045: done checking for any_errors_fatal 41016 1727204181.30046: checking for max_fail_percentage 41016 1727204181.30047: done checking for max_fail_percentage 41016 1727204181.30048: checking to see if all hosts have failed and the running result is not ok 41016 1727204181.30048: done checking to see if all hosts have failed 41016 1727204181.30049: getting the remaining hosts for this loop 41016 1727204181.30050: done getting the remaining hosts for this loop 41016 1727204181.30054: getting the next task for host managed-node1 41016 1727204181.30058: done getting next task for host managed-node1 41016 1727204181.30064: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 41016 1727204181.30066: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204181.30068: getting variables 41016 1727204181.30069: in VariableManager get_vars() 41016 1727204181.30083: Calling all_inventory to load vars for managed-node1 41016 1727204181.30085: Calling groups_inventory to load vars for managed-node1 41016 1727204181.30087: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204181.30092: Calling all_plugins_play to load vars for managed-node1 41016 1727204181.30093: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204181.30095: Calling groups_plugins_play to load vars for managed-node1 41016 1727204181.30246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204181.30421: done with get_vars() 41016 1727204181.30431: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:56:21 -0400 (0:00:00.021) 0:00:04.981 ***** 41016 1727204181.30497: entering _queue_task() for managed-node1/include_tasks 41016 1727204181.30754: worker is 1 (out of 1 available) 41016 1727204181.30766: exiting _queue_task() for managed-node1/include_tasks 41016 1727204181.30981: done queuing things up, now waiting for results queue to drain 41016 1727204181.30983: waiting for pending results... 
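
[editor's note] The include chain logged above (tests_route_device.yml -> show_interfaces.yml -> get_current_interfaces.yml) implies a very small show_interfaces.yml. A sketch under that assumption; only the include at line 3 is evidenced by this log, and the trailing debug task is purely an assumption:

# Hypothetical sketch of tasks/show_interfaces.yml.
- name: Include the task 'get_current_interfaces.yml'
  ansible.builtin.include_tasks: get_current_interfaces.yml

# Assumed follow-up task: print the list gathered by the include above.
- name: Show current_interfaces
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"
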
41016 1727204181.31115: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 41016 1727204181.31150: in run() - task 028d2410-947f-12d5-0ec4-000000000135 41016 1727204181.31170: variable 'ansible_search_path' from source: unknown 41016 1727204181.31183: variable 'ansible_search_path' from source: unknown 41016 1727204181.31231: calling self._execute() 41016 1727204181.31324: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204181.31336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204181.31349: variable 'omit' from source: magic vars 41016 1727204181.31743: variable 'ansible_distribution_major_version' from source: facts 41016 1727204181.31767: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204181.31781: _execute() done 41016 1727204181.31862: dumping result to json 41016 1727204181.31865: done dumping result, returning 41016 1727204181.31868: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [028d2410-947f-12d5-0ec4-000000000135] 41016 1727204181.31870: sending task result for task 028d2410-947f-12d5-0ec4-000000000135 41016 1727204181.31942: done sending task result for task 028d2410-947f-12d5-0ec4-000000000135 41016 1727204181.31945: WORKER PROCESS EXITING 41016 1727204181.31994: no more pending results, returning what we have 41016 1727204181.32005: in VariableManager get_vars() 41016 1727204181.32061: Calling all_inventory to load vars for managed-node1 41016 1727204181.32064: Calling groups_inventory to load vars for managed-node1 41016 1727204181.32067: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204181.32082: Calling all_plugins_play to load vars for managed-node1 41016 1727204181.32086: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204181.32089: Calling groups_plugins_play to load vars for managed-node1 41016 1727204181.32498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204181.32698: done with get_vars() 41016 1727204181.32707: variable 'ansible_search_path' from source: unknown 41016 1727204181.32710: variable 'ansible_search_path' from source: unknown 41016 1727204181.32748: we have included files to process 41016 1727204181.32749: generating all_blocks data 41016 1727204181.32751: done generating all_blocks data 41016 1727204181.32752: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41016 1727204181.32753: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41016 1727204181.32755: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41016 1727204181.33100: done processing included file 41016 1727204181.33104: iterating over new_blocks loaded from include file 41016 1727204181.33105: in VariableManager get_vars() 41016 1727204181.33126: done with get_vars() 41016 1727204181.33128: filtering new block on tags 41016 1727204181.33145: done filtering new block on tags 41016 1727204181.33147: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node1 41016 1727204181.33151: extending task lists for all hosts with included blocks 41016 1727204181.33253: done extending task lists 41016 1727204181.33254: done processing included files 41016 1727204181.33255: results queue empty 41016 1727204181.33256: checking for any_errors_fatal 41016 1727204181.33259: done checking for any_errors_fatal 41016 1727204181.33260: checking for max_fail_percentage 41016 1727204181.33261: done checking for max_fail_percentage 41016 1727204181.33261: checking to see if all hosts have failed and the running result is not ok 41016 1727204181.33262: done checking to see if all hosts have failed 41016 1727204181.33263: getting the remaining hosts for this loop 41016 1727204181.33264: done getting the remaining hosts for this loop 41016 1727204181.33266: getting the next task for host managed-node1 41016 1727204181.33270: done getting next task for host managed-node1 41016 1727204181.33272: ^ task is: TASK: Gather current interface info 41016 1727204181.33274: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204181.33278: getting variables 41016 1727204181.33279: in VariableManager get_vars() 41016 1727204181.33291: Calling all_inventory to load vars for managed-node1 41016 1727204181.33293: Calling groups_inventory to load vars for managed-node1 41016 1727204181.33295: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204181.33300: Calling all_plugins_play to load vars for managed-node1 41016 1727204181.33302: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204181.33305: Calling groups_plugins_play to load vars for managed-node1 41016 1727204181.33444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204181.33631: done with get_vars() 41016 1727204181.33639: done getting variables 41016 1727204181.33677: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:56:21 -0400 (0:00:00.032) 0:00:05.013 ***** 41016 1727204181.33704: entering _queue_task() for managed-node1/command 41016 1727204181.34204: worker is 1 (out of 1 available) 41016 1727204181.34213: exiting _queue_task() for managed-node1/command 41016 1727204181.34223: done queuing things up, now waiting for results queue to drain 41016 1727204181.34225: waiting for pending results... 
41016 1727204181.34351: running TaskExecutor() for managed-node1/TASK: Gather current interface info 41016 1727204181.34357: in run() - task 028d2410-947f-12d5-0ec4-00000000014e 41016 1727204181.34379: variable 'ansible_search_path' from source: unknown 41016 1727204181.34387: variable 'ansible_search_path' from source: unknown 41016 1727204181.34427: calling self._execute() 41016 1727204181.34512: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204181.34523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204181.34536: variable 'omit' from source: magic vars 41016 1727204181.34929: variable 'ansible_distribution_major_version' from source: facts 41016 1727204181.34946: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204181.34956: variable 'omit' from source: magic vars 41016 1727204181.35006: variable 'omit' from source: magic vars 41016 1727204181.35044: variable 'omit' from source: magic vars 41016 1727204181.35099: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204181.35207: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204181.35210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204181.35213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204181.35215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204181.35221: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204181.35228: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204181.35235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204181.35338: Set connection var ansible_shell_executable to /bin/sh 41016 1727204181.35349: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204181.35359: Set connection var ansible_shell_type to sh 41016 1727204181.35369: Set connection var ansible_timeout to 10 41016 1727204181.35381: Set connection var ansible_pipelining to False 41016 1727204181.35392: Set connection var ansible_connection to ssh 41016 1727204181.35416: variable 'ansible_shell_executable' from source: unknown 41016 1727204181.35428: variable 'ansible_connection' from source: unknown 41016 1727204181.35437: variable 'ansible_module_compression' from source: unknown 41016 1727204181.35444: variable 'ansible_shell_type' from source: unknown 41016 1727204181.35452: variable 'ansible_shell_executable' from source: unknown 41016 1727204181.35459: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204181.35533: variable 'ansible_pipelining' from source: unknown 41016 1727204181.35536: variable 'ansible_timeout' from source: unknown 41016 1727204181.35538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204181.35621: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204181.35643: variable 'omit' from source: magic vars 41016 
1727204181.35655: starting attempt loop 41016 1727204181.35662: running the handler 41016 1727204181.35685: _low_level_execute_command(): starting 41016 1727204181.35700: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204181.36497: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204181.36541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204181.36561: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204181.36573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204181.36707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41016 1727204181.39196: stdout chunk (state=3): >>>/root <<< 41016 1727204181.39414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204181.39418: stdout chunk (state=3): >>><<< 41016 1727204181.39421: stderr chunk (state=3): >>><<< 41016 1727204181.39542: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41016 1727204181.39546: _low_level_execute_command(): starting 41016 1727204181.39549: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204181.3944643-41490-272483145175914 `" && echo ansible-tmp-1727204181.3944643-41490-272483145175914="` echo 
/root/.ansible/tmp/ansible-tmp-1727204181.3944643-41490-272483145175914 `" ) && sleep 0' 41016 1727204181.40141: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204181.40156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204181.40173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204181.40199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204181.40234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204181.40342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204181.40388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204181.40468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41016 1727204181.43409: stdout chunk (state=3): >>>ansible-tmp-1727204181.3944643-41490-272483145175914=/root/.ansible/tmp/ansible-tmp-1727204181.3944643-41490-272483145175914 <<< 41016 1727204181.43624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204181.43627: stdout chunk (state=3): >>><<< 41016 1727204181.43680: stderr chunk (state=3): >>><<< 41016 1727204181.43684: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204181.3944643-41490-272483145175914=/root/.ansible/tmp/ansible-tmp-1727204181.3944643-41490-272483145175914 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 41016 1727204181.43692: variable 'ansible_module_compression' from source: unknown 41016 1727204181.43749: ANSIBALLZ: Using generic lock for 
ansible.legacy.command 41016 1727204181.43752: ANSIBALLZ: Acquiring lock 41016 1727204181.43755: ANSIBALLZ: Lock acquired: 140580610774160 41016 1727204181.43759: ANSIBALLZ: Creating module 41016 1727204181.63384: ANSIBALLZ: Writing module into payload 41016 1727204181.63406: ANSIBALLZ: Writing module 41016 1727204181.63542: ANSIBALLZ: Renaming module 41016 1727204181.63554: ANSIBALLZ: Done creating module 41016 1727204181.63595: variable 'ansible_facts' from source: unknown 41016 1727204181.63791: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204181.3944643-41490-272483145175914/AnsiballZ_command.py 41016 1727204181.63868: Sending initial data 41016 1727204181.63962: Sent initial data (156 bytes) 41016 1727204181.64961: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204181.64980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204181.65084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204181.65113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204181.65134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204181.65172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204181.65304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 41016 1727204181.67891: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204181.68034: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204181.68070: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpn3jluqwq /root/.ansible/tmp/ansible-tmp-1727204181.3944643-41490-272483145175914/AnsiballZ_command.py <<< 41016 1727204181.68077: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204181.3944643-41490-272483145175914/AnsiballZ_command.py" <<< 41016 1727204181.68180: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpn3jluqwq" to remote "/root/.ansible/tmp/ansible-tmp-1727204181.3944643-41490-272483145175914/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204181.3944643-41490-272483145175914/AnsiballZ_command.py" <<< 41016 1727204181.69992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204181.70201: stderr chunk (state=3): >>><<< 41016 1727204181.70205: stdout chunk (state=3): >>><<< 41016 1727204181.70227: done transferring module to remote 41016 1727204181.70238: _low_level_execute_command(): starting 41016 1727204181.70243: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204181.3944643-41490-272483145175914/ /root/.ansible/tmp/ansible-tmp-1727204181.3944643-41490-272483145175914/AnsiballZ_command.py && sleep 0' 41016 1727204181.71503: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204181.71681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204181.71926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204181.71989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204181.74005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204181.74009: stdout chunk (state=3): >>><<< 41016 1727204181.74012: stderr chunk (state=3): >>><<< 41016 1727204181.74115: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204181.74119: _low_level_execute_command(): starting 41016 1727204181.74130: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204181.3944643-41490-272483145175914/AnsiballZ_command.py && sleep 0' 41016 1727204181.75338: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204181.75573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204181.75596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204181.75615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204181.75792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204181.92440: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:56:21.918854", "end": "2024-09-24 14:56:21.922438", "delta": "0:00:00.003584", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204181.94298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204181.94361: stderr chunk (state=3): >>><<< 41016 1727204181.94416: stdout chunk (state=3): >>><<< 41016 1727204181.94439: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:56:21.918854", "end": "2024-09-24 14:56:21.922438", "delta": "0:00:00.003584", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
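
[editor's note] The module invocation echoed in the stdout above gives the exact arguments of the "Gather current interface info" task (chdir /sys/class/net, command ls -1). A sketch of get_current_interfaces.yml:3 under those arguments; the register name _current_interfaces is inferred from the variable the following "Set current_interfaces" task reads and is an assumption:

# Reconstruction of the 'Gather current interface info' task from the
# module_args in the log; the register name is an assumption.
- name: Gather current interface info
  ansible.builtin.command:
    chdir: /sys/class/net
    cmd: ls -1
  register: _current_interfaces
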
41016 1727204181.94492: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204181.3944643-41490-272483145175914/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204181.94606: _low_level_execute_command(): starting 41016 1727204181.94612: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204181.3944643-41490-272483145175914/ > /dev/null 2>&1 && sleep 0' 41016 1727204181.95268: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204181.95299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204181.95407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204181.95486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204181.95566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204181.97549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204181.97671: stderr chunk (state=3): >>><<< 41016 1727204181.97675: stdout chunk (state=3): >>><<< 41016 1727204181.97762: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204181.97766: handler run complete 41016 1727204181.97825: Evaluated conditional (False): False 41016 1727204181.97843: attempt loop complete, returning result 41016 1727204181.97851: _execute() done 41016 1727204181.97990: dumping result to json 41016 1727204181.97993: done dumping result, returning 41016 1727204181.97995: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [028d2410-947f-12d5-0ec4-00000000014e] 41016 1727204181.97998: sending task result for task 028d2410-947f-12d5-0ec4-00000000014e ok: [managed-node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003584", "end": "2024-09-24 14:56:21.922438", "rc": 0, "start": "2024-09-24 14:56:21.918854" } STDOUT: bonding_masters eth0 lo 41016 1727204181.98260: no more pending results, returning what we have 41016 1727204181.98264: results queue empty 41016 1727204181.98265: checking for any_errors_fatal 41016 1727204181.98267: done checking for any_errors_fatal 41016 1727204181.98268: checking for max_fail_percentage 41016 1727204181.98270: done checking for max_fail_percentage 41016 1727204181.98271: checking to see if all hosts have failed and the running result is not ok 41016 1727204181.98271: done checking to see if all hosts have failed 41016 1727204181.98272: getting the remaining hosts for this loop 41016 1727204181.98273: done getting the remaining hosts for this loop 41016 1727204181.98279: getting the next task for host managed-node1 41016 1727204181.98291: done getting next task for host managed-node1 41016 1727204181.98294: ^ task is: TASK: Set current_interfaces 41016 1727204181.98298: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204181.98302: getting variables 41016 1727204181.98303: in VariableManager get_vars() 41016 1727204181.98346: Calling all_inventory to load vars for managed-node1 41016 1727204181.98354: Calling groups_inventory to load vars for managed-node1 41016 1727204181.98358: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204181.98369: Calling all_plugins_play to load vars for managed-node1 41016 1727204181.98601: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204181.98612: Calling groups_plugins_play to load vars for managed-node1 41016 1727204181.99414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204181.99996: done with get_vars() 41016 1727204182.00016: done getting variables 41016 1727204182.00073: done sending task result for task 028d2410-947f-12d5-0ec4-00000000014e 41016 1727204182.00083: WORKER PROCESS EXITING 41016 1727204182.00227: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:56:22 -0400 (0:00:00.665) 0:00:05.678 ***** 41016 1727204182.00262: entering _queue_task() for managed-node1/set_fact 41016 1727204182.00984: worker is 1 (out of 1 available) 41016 1727204182.01079: exiting _queue_task() for managed-node1/set_fact 41016 1727204182.01090: done queuing things up, now waiting for results queue to drain 41016 1727204182.01092: waiting for pending results... 
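The result above and the 'Set current_interfaces' task being queued here correspond to the first two tasks of get_current_interfaces.yml (lines 3 and 9 in the task paths logged in this trace). A minimal reconstruction from the module arguments and fact names visible in the log; the register name is inferred from the later '_current_interfaces' variable lookup, and the stdout_lines filter is an assumption:

# Hypothetical sketch of tasks/get_current_interfaces.yml, reconstructed from this log.
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net
  register: _current_interfaces   # name inferred from "variable '_current_interfaces' from source: set_fact"

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # assumed filter; the log only shows the resulting list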
41016 1727204182.01456: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 41016 1727204182.01635: in run() - task 028d2410-947f-12d5-0ec4-00000000014f 41016 1727204182.01653: variable 'ansible_search_path' from source: unknown 41016 1727204182.01657: variable 'ansible_search_path' from source: unknown 41016 1727204182.01744: calling self._execute() 41016 1727204182.02045: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.02051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.02060: variable 'omit' from source: magic vars 41016 1727204182.02929: variable 'ansible_distribution_major_version' from source: facts 41016 1727204182.02952: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204182.02958: variable 'omit' from source: magic vars 41016 1727204182.03004: variable 'omit' from source: magic vars 41016 1727204182.03132: variable '_current_interfaces' from source: set_fact 41016 1727204182.03204: variable 'omit' from source: magic vars 41016 1727204182.03248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204182.03296: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204182.03329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204182.03378: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204182.03395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204182.03428: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204182.03440: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.03454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.03585: Set connection var ansible_shell_executable to /bin/sh 41016 1727204182.03598: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204182.03612: Set connection var ansible_shell_type to sh 41016 1727204182.03625: Set connection var ansible_timeout to 10 41016 1727204182.03635: Set connection var ansible_pipelining to False 41016 1727204182.03658: Set connection var ansible_connection to ssh 41016 1727204182.03693: variable 'ansible_shell_executable' from source: unknown 41016 1727204182.03702: variable 'ansible_connection' from source: unknown 41016 1727204182.03711: variable 'ansible_module_compression' from source: unknown 41016 1727204182.03752: variable 'ansible_shell_type' from source: unknown 41016 1727204182.03756: variable 'ansible_shell_executable' from source: unknown 41016 1727204182.03761: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.03764: variable 'ansible_pipelining' from source: unknown 41016 1727204182.03766: variable 'ansible_timeout' from source: unknown 41016 1727204182.03768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.04021: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 41016 1727204182.04025: variable 'omit' from source: magic vars 41016 1727204182.04031: starting attempt loop 41016 1727204182.04039: running the handler 41016 1727204182.04058: handler run complete 41016 1727204182.04079: attempt loop complete, returning result 41016 1727204182.04134: _execute() done 41016 1727204182.04138: dumping result to json 41016 1727204182.04140: done dumping result, returning 41016 1727204182.04143: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [028d2410-947f-12d5-0ec4-00000000014f] 41016 1727204182.04145: sending task result for task 028d2410-947f-12d5-0ec4-00000000014f ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 41016 1727204182.04439: no more pending results, returning what we have 41016 1727204182.04443: results queue empty 41016 1727204182.04444: checking for any_errors_fatal 41016 1727204182.04566: done checking for any_errors_fatal 41016 1727204182.04568: checking for max_fail_percentage 41016 1727204182.04570: done checking for max_fail_percentage 41016 1727204182.04571: checking to see if all hosts have failed and the running result is not ok 41016 1727204182.04572: done checking to see if all hosts have failed 41016 1727204182.04572: getting the remaining hosts for this loop 41016 1727204182.04574: done getting the remaining hosts for this loop 41016 1727204182.04580: getting the next task for host managed-node1 41016 1727204182.04588: done getting next task for host managed-node1 41016 1727204182.04591: ^ task is: TASK: Show current_interfaces 41016 1727204182.04593: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204182.04596: getting variables 41016 1727204182.04598: in VariableManager get_vars() 41016 1727204182.04640: Calling all_inventory to load vars for managed-node1 41016 1727204182.04643: Calling groups_inventory to load vars for managed-node1 41016 1727204182.04648: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204182.04716: done sending task result for task 028d2410-947f-12d5-0ec4-00000000014f 41016 1727204182.04719: WORKER PROCESS EXITING 41016 1727204182.04730: Calling all_plugins_play to load vars for managed-node1 41016 1727204182.04737: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204182.04741: Calling groups_plugins_play to load vars for managed-node1 41016 1727204182.05053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204182.05340: done with get_vars() 41016 1727204182.05351: done getting variables 41016 1727204182.05459: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:56:22 -0400 (0:00:00.052) 0:00:05.731 ***** 41016 1727204182.05502: entering _queue_task() for managed-node1/debug 41016 1727204182.05504: Creating lock for debug 41016 1727204182.05795: worker is 1 (out of 1 available) 41016 1727204182.05811: exiting _queue_task() for managed-node1/debug 41016 1727204182.05823: done queuing things up, now waiting for results queue to drain 41016 1727204182.05825: waiting for pending results... 
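Throughout this trace, ansible_host and ansible_ssh_extra_args resolve "from source: host vars for 'managed-node1'", and the connection vars settle on ssh with a 10 second timeout and pipelining disabled. A hypothetical inventory entry that would yield host vars of this shape; the ansible_ssh_extra_args value is a placeholder, since only the variable names and the 10.31.14.47 address appear in the log:

all:
  hosts:
    managed-node1:
      ansible_host: 10.31.14.47
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder value, not taken from this log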
41016 1727204182.06033: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 41016 1727204182.06127: in run() - task 028d2410-947f-12d5-0ec4-000000000136 41016 1727204182.06147: variable 'ansible_search_path' from source: unknown 41016 1727204182.06155: variable 'ansible_search_path' from source: unknown 41016 1727204182.06196: calling self._execute() 41016 1727204182.06278: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.06291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.06307: variable 'omit' from source: magic vars 41016 1727204182.06793: variable 'ansible_distribution_major_version' from source: facts 41016 1727204182.06798: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204182.06801: variable 'omit' from source: magic vars 41016 1727204182.06843: variable 'omit' from source: magic vars 41016 1727204182.06979: variable 'current_interfaces' from source: set_fact 41016 1727204182.07011: variable 'omit' from source: magic vars 41016 1727204182.07074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204182.07280: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204182.07285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204182.07292: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204182.07295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204182.07298: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204182.07301: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.07304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.07407: Set connection var ansible_shell_executable to /bin/sh 41016 1727204182.07418: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204182.07428: Set connection var ansible_shell_type to sh 41016 1727204182.07437: Set connection var ansible_timeout to 10 41016 1727204182.07468: Set connection var ansible_pipelining to False 41016 1727204182.07482: Set connection var ansible_connection to ssh 41016 1727204182.07507: variable 'ansible_shell_executable' from source: unknown 41016 1727204182.07515: variable 'ansible_connection' from source: unknown 41016 1727204182.07522: variable 'ansible_module_compression' from source: unknown 41016 1727204182.07530: variable 'ansible_shell_type' from source: unknown 41016 1727204182.07560: variable 'ansible_shell_executable' from source: unknown 41016 1727204182.07569: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.07578: variable 'ansible_pipelining' from source: unknown 41016 1727204182.07585: variable 'ansible_timeout' from source: unknown 41016 1727204182.07593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.07724: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
41016 1727204182.07746: variable 'omit' from source: magic vars 41016 1727204182.07773: starting attempt loop 41016 1727204182.07783: running the handler 41016 1727204182.07830: handler run complete 41016 1727204182.07847: attempt loop complete, returning result 41016 1727204182.07854: _execute() done 41016 1727204182.07860: dumping result to json 41016 1727204182.07868: done dumping result, returning 41016 1727204182.07940: done running TaskExecutor() for managed-node1/TASK: Show current_interfaces [028d2410-947f-12d5-0ec4-000000000136] 41016 1727204182.07944: sending task result for task 028d2410-947f-12d5-0ec4-000000000136 41016 1727204182.08019: done sending task result for task 028d2410-947f-12d5-0ec4-000000000136 41016 1727204182.08023: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 41016 1727204182.08088: no more pending results, returning what we have 41016 1727204182.08093: results queue empty 41016 1727204182.08094: checking for any_errors_fatal 41016 1727204182.08100: done checking for any_errors_fatal 41016 1727204182.08101: checking for max_fail_percentage 41016 1727204182.08102: done checking for max_fail_percentage 41016 1727204182.08103: checking to see if all hosts have failed and the running result is not ok 41016 1727204182.08105: done checking to see if all hosts have failed 41016 1727204182.08105: getting the remaining hosts for this loop 41016 1727204182.08107: done getting the remaining hosts for this loop 41016 1727204182.08113: getting the next task for host managed-node1 41016 1727204182.08120: done getting next task for host managed-node1 41016 1727204182.08123: ^ task is: TASK: Manage test interface 41016 1727204182.08124: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204182.08127: getting variables 41016 1727204182.08129: in VariableManager get_vars() 41016 1727204182.08285: Calling all_inventory to load vars for managed-node1 41016 1727204182.08289: Calling groups_inventory to load vars for managed-node1 41016 1727204182.08292: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204182.08381: Calling all_plugins_play to load vars for managed-node1 41016 1727204182.08386: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204182.08389: Calling groups_plugins_play to load vars for managed-node1 41016 1727204182.08607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204182.08837: done with get_vars() 41016 1727204182.08846: done getting variables TASK [Manage test interface] *************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:17 Tuesday 24 September 2024 14:56:22 -0400 (0:00:00.034) 0:00:05.765 ***** 41016 1727204182.08937: entering _queue_task() for managed-node1/include_tasks 41016 1727204182.09219: worker is 1 (out of 1 available) 41016 1727204182.09230: exiting _queue_task() for managed-node1/include_tasks 41016 1727204182.09244: done queuing things up, now waiting for results queue to drain 41016 1727204182.09245: waiting for pending results... 
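The 'Show current_interfaces' message above ("current_interfaces: ['bonding_masters', 'eth0', 'lo']") is consistent with a plain debug task at show_interfaces.yml:5; a minimal sketch:

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"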
41016 1727204182.09690: running TaskExecutor() for managed-node1/TASK: Manage test interface 41016 1727204182.09695: in run() - task 028d2410-947f-12d5-0ec4-00000000000d 41016 1727204182.09748: variable 'ansible_search_path' from source: unknown 41016 1727204182.09791: calling self._execute() 41016 1727204182.09899: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.09915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.09947: variable 'omit' from source: magic vars 41016 1727204182.10401: variable 'ansible_distribution_major_version' from source: facts 41016 1727204182.10421: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204182.10479: _execute() done 41016 1727204182.10483: dumping result to json 41016 1727204182.10486: done dumping result, returning 41016 1727204182.10493: done running TaskExecutor() for managed-node1/TASK: Manage test interface [028d2410-947f-12d5-0ec4-00000000000d] 41016 1727204182.10496: sending task result for task 028d2410-947f-12d5-0ec4-00000000000d 41016 1727204182.10715: done sending task result for task 028d2410-947f-12d5-0ec4-00000000000d 41016 1727204182.10718: WORKER PROCESS EXITING 41016 1727204182.10765: no more pending results, returning what we have 41016 1727204182.10772: in VariableManager get_vars() 41016 1727204182.10830: Calling all_inventory to load vars for managed-node1 41016 1727204182.10833: Calling groups_inventory to load vars for managed-node1 41016 1727204182.10836: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204182.10907: Calling all_plugins_play to load vars for managed-node1 41016 1727204182.10913: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204182.10917: Calling groups_plugins_play to load vars for managed-node1 41016 1727204182.11273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204182.11528: done with get_vars() 41016 1727204182.11535: variable 'ansible_search_path' from source: unknown 41016 1727204182.11547: we have included files to process 41016 1727204182.11549: generating all_blocks data 41016 1727204182.11550: done generating all_blocks data 41016 1727204182.11554: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41016 1727204182.11556: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41016 1727204182.11558: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41016 1727204182.12200: in VariableManager get_vars() 41016 1727204182.12224: done with get_vars() 41016 1727204182.12718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 41016 1727204182.13333: done processing included file 41016 1727204182.13335: iterating over new_blocks loaded from include file 41016 1727204182.13336: in VariableManager get_vars() 41016 1727204182.13362: done with get_vars() 41016 1727204182.13364: filtering new block on tags 41016 1727204182.13395: done filtering new block on tags 41016 1727204182.13398: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node1 
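The 'Manage test interface' task at tests_route_device.yml:17 is an include_tasks that loads manage_test_interface.yml, as the include processing above shows. A sketch of such an include; the vars block is an assumption (the trace only shows that 'state' later resolves from include params and 'type' from set_fact):

- name: Manage test interface
  include_tasks: tasks/manage_test_interface.yml
  vars:
    state: present   # example value; the log shows only that the state guard passes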
41016 1727204182.13403: extending task lists for all hosts with included blocks 41016 1727204182.13662: done extending task lists 41016 1727204182.13663: done processing included files 41016 1727204182.13664: results queue empty 41016 1727204182.13665: checking for any_errors_fatal 41016 1727204182.13667: done checking for any_errors_fatal 41016 1727204182.13668: checking for max_fail_percentage 41016 1727204182.13669: done checking for max_fail_percentage 41016 1727204182.13670: checking to see if all hosts have failed and the running result is not ok 41016 1727204182.13671: done checking to see if all hosts have failed 41016 1727204182.13671: getting the remaining hosts for this loop 41016 1727204182.13672: done getting the remaining hosts for this loop 41016 1727204182.13682: getting the next task for host managed-node1 41016 1727204182.13686: done getting next task for host managed-node1 41016 1727204182.13688: ^ task is: TASK: Ensure state in ["present", "absent"] 41016 1727204182.13690: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204182.13692: getting variables 41016 1727204182.13693: in VariableManager get_vars() 41016 1727204182.13705: Calling all_inventory to load vars for managed-node1 41016 1727204182.13707: Calling groups_inventory to load vars for managed-node1 41016 1727204182.13712: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204182.13718: Calling all_plugins_play to load vars for managed-node1 41016 1727204182.13720: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204182.13723: Calling groups_plugins_play to load vars for managed-node1 41016 1727204182.13864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204182.14062: done with get_vars() 41016 1727204182.14070: done getting variables 41016 1727204182.14140: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 14:56:22 -0400 (0:00:00.052) 0:00:05.817 ***** 41016 1727204182.14165: entering _queue_task() for managed-node1/fail 41016 1727204182.14167: Creating lock for fail 41016 1727204182.14502: worker is 1 (out of 1 available) 41016 1727204182.14516: exiting _queue_task() for managed-node1/fail 41016 1727204182.14527: done queuing things up, now waiting for results queue to drain 41016 1727204182.14529: waiting for pending results... 
41016 1727204182.14992: running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] 41016 1727204182.15002: in run() - task 028d2410-947f-12d5-0ec4-00000000016a 41016 1727204182.15006: variable 'ansible_search_path' from source: unknown 41016 1727204182.15009: variable 'ansible_search_path' from source: unknown 41016 1727204182.15011: calling self._execute() 41016 1727204182.15049: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.15056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.15065: variable 'omit' from source: magic vars 41016 1727204182.15477: variable 'ansible_distribution_major_version' from source: facts 41016 1727204182.15490: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204182.15649: variable 'state' from source: include params 41016 1727204182.15655: Evaluated conditional (state not in ["present", "absent"]): False 41016 1727204182.15658: when evaluation is False, skipping this task 41016 1727204182.15661: _execute() done 41016 1727204182.15669: dumping result to json 41016 1727204182.15672: done dumping result, returning 41016 1727204182.15680: done running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] [028d2410-947f-12d5-0ec4-00000000016a] 41016 1727204182.15686: sending task result for task 028d2410-947f-12d5-0ec4-00000000016a 41016 1727204182.15767: done sending task result for task 028d2410-947f-12d5-0ec4-00000000016a 41016 1727204182.15771: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 41016 1727204182.15980: no more pending results, returning what we have 41016 1727204182.15983: results queue empty 41016 1727204182.15985: checking for any_errors_fatal 41016 1727204182.15986: done checking for any_errors_fatal 41016 1727204182.15987: checking for max_fail_percentage 41016 1727204182.15988: done checking for max_fail_percentage 41016 1727204182.15989: checking to see if all hosts have failed and the running result is not ok 41016 1727204182.15990: done checking to see if all hosts have failed 41016 1727204182.15991: getting the remaining hosts for this loop 41016 1727204182.15992: done getting the remaining hosts for this loop 41016 1727204182.15995: getting the next task for host managed-node1 41016 1727204182.16000: done getting next task for host managed-node1 41016 1727204182.16003: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 41016 1727204182.16006: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204182.16011: getting variables 41016 1727204182.16013: in VariableManager get_vars() 41016 1727204182.16043: Calling all_inventory to load vars for managed-node1 41016 1727204182.16045: Calling groups_inventory to load vars for managed-node1 41016 1727204182.16048: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204182.16057: Calling all_plugins_play to load vars for managed-node1 41016 1727204182.16061: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204182.16064: Calling groups_plugins_play to load vars for managed-node1 41016 1727204182.16340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204182.16566: done with get_vars() 41016 1727204182.16577: done getting variables 41016 1727204182.16640: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 14:56:22 -0400 (0:00:00.024) 0:00:05.842 ***** 41016 1727204182.16665: entering _queue_task() for managed-node1/fail 41016 1727204182.16935: worker is 1 (out of 1 available) 41016 1727204182.17062: exiting _queue_task() for managed-node1/fail 41016 1727204182.17072: done queuing things up, now waiting for results queue to drain 41016 1727204182.17073: waiting for pending results... 41016 1727204182.17233: running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] 41016 1727204182.17312: in run() - task 028d2410-947f-12d5-0ec4-00000000016b 41016 1727204182.17381: variable 'ansible_search_path' from source: unknown 41016 1727204182.17386: variable 'ansible_search_path' from source: unknown 41016 1727204182.17388: calling self._execute() 41016 1727204182.17454: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.17459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.17470: variable 'omit' from source: magic vars 41016 1727204182.18110: variable 'ansible_distribution_major_version' from source: facts 41016 1727204182.18114: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204182.18117: variable 'type' from source: set_fact 41016 1727204182.18120: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 41016 1727204182.18122: when evaluation is False, skipping this task 41016 1727204182.18124: _execute() done 41016 1727204182.18127: dumping result to json 41016 1727204182.18129: done dumping result, returning 41016 1727204182.18131: done running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] [028d2410-947f-12d5-0ec4-00000000016b] 41016 1727204182.18137: sending task result for task 028d2410-947f-12d5-0ec4-00000000016b 41016 1727204182.18289: done sending task result for task 028d2410-947f-12d5-0ec4-00000000016b 41016 1727204182.18294: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 41016 1727204182.18394: no more pending 
results, returning what we have 41016 1727204182.18399: results queue empty 41016 1727204182.18400: checking for any_errors_fatal 41016 1727204182.18410: done checking for any_errors_fatal 41016 1727204182.18411: checking for max_fail_percentage 41016 1727204182.18413: done checking for max_fail_percentage 41016 1727204182.18413: checking to see if all hosts have failed and the running result is not ok 41016 1727204182.18414: done checking to see if all hosts have failed 41016 1727204182.18415: getting the remaining hosts for this loop 41016 1727204182.18416: done getting the remaining hosts for this loop 41016 1727204182.18420: getting the next task for host managed-node1 41016 1727204182.18428: done getting next task for host managed-node1 41016 1727204182.18431: ^ task is: TASK: Include the task 'show_interfaces.yml' 41016 1727204182.18434: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204182.18442: getting variables 41016 1727204182.18444: in VariableManager get_vars() 41016 1727204182.18489: Calling all_inventory to load vars for managed-node1 41016 1727204182.18492: Calling groups_inventory to load vars for managed-node1 41016 1727204182.18495: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204182.18507: Calling all_plugins_play to load vars for managed-node1 41016 1727204182.18513: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204182.18516: Calling groups_plugins_play to load vars for managed-node1 41016 1727204182.18985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204182.19295: done with get_vars() 41016 1727204182.19305: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 14:56:22 -0400 (0:00:00.027) 0:00:05.870 ***** 41016 1727204182.19406: entering _queue_task() for managed-node1/include_tasks 41016 1727204182.19811: worker is 1 (out of 1 available) 41016 1727204182.19821: exiting _queue_task() for managed-node1/include_tasks 41016 1727204182.19830: done queuing things up, now waiting for results queue to drain 41016 1727204182.19832: waiting for pending results... 
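Both guard tasks at the top of manage_test_interface.yml (lines 3 and 8) are skipped because their when conditions evaluate to False; the conditions are quoted verbatim in the false_condition fields above. A sketch of the two guards, with assumed failure messages:

- name: Ensure state in ["present", "absent"]
  fail:
    msg: state must be 'present' or 'absent'   # message assumed; not shown in the log
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: type must be 'dummy', 'tap' or 'veth'   # message assumed; not shown in the log
  when: type not in ["dummy", "tap", "veth"]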
41016 1727204182.20295: running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' 41016 1727204182.20300: in run() - task 028d2410-947f-12d5-0ec4-00000000016c 41016 1727204182.20304: variable 'ansible_search_path' from source: unknown 41016 1727204182.20307: variable 'ansible_search_path' from source: unknown 41016 1727204182.20310: calling self._execute() 41016 1727204182.20320: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.20398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.20402: variable 'omit' from source: magic vars 41016 1727204182.20892: variable 'ansible_distribution_major_version' from source: facts 41016 1727204182.20933: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204182.20937: _execute() done 41016 1727204182.20939: dumping result to json 41016 1727204182.20941: done dumping result, returning 41016 1727204182.20944: done running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' [028d2410-947f-12d5-0ec4-00000000016c] 41016 1727204182.20946: sending task result for task 028d2410-947f-12d5-0ec4-00000000016c 41016 1727204182.21023: done sending task result for task 028d2410-947f-12d5-0ec4-00000000016c 41016 1727204182.21027: WORKER PROCESS EXITING 41016 1727204182.21060: no more pending results, returning what we have 41016 1727204182.21066: in VariableManager get_vars() 41016 1727204182.21122: Calling all_inventory to load vars for managed-node1 41016 1727204182.21125: Calling groups_inventory to load vars for managed-node1 41016 1727204182.21128: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204182.21141: Calling all_plugins_play to load vars for managed-node1 41016 1727204182.21144: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204182.21148: Calling groups_plugins_play to load vars for managed-node1 41016 1727204182.21542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204182.21846: done with get_vars() 41016 1727204182.21855: variable 'ansible_search_path' from source: unknown 41016 1727204182.21856: variable 'ansible_search_path' from source: unknown 41016 1727204182.21896: we have included files to process 41016 1727204182.21898: generating all_blocks data 41016 1727204182.21899: done generating all_blocks data 41016 1727204182.21905: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41016 1727204182.21906: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41016 1727204182.21911: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41016 1727204182.22084: in VariableManager get_vars() 41016 1727204182.22112: done with get_vars() 41016 1727204182.22434: done processing included file 41016 1727204182.22436: iterating over new_blocks loaded from include file 41016 1727204182.22437: in VariableManager get_vars() 41016 1727204182.22457: done with get_vars() 41016 1727204182.22459: filtering new block on tags 41016 1727204182.22478: done filtering new block on tags 41016 1727204182.22481: done iterating over new_blocks loaded from include file included: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 41016 1727204182.22486: extending task lists for all hosts with included blocks 41016 1727204182.23237: done extending task lists 41016 1727204182.23239: done processing included files 41016 1727204182.23239: results queue empty 41016 1727204182.23240: checking for any_errors_fatal 41016 1727204182.23244: done checking for any_errors_fatal 41016 1727204182.23245: checking for max_fail_percentage 41016 1727204182.23246: done checking for max_fail_percentage 41016 1727204182.23247: checking to see if all hosts have failed and the running result is not ok 41016 1727204182.23248: done checking to see if all hosts have failed 41016 1727204182.23248: getting the remaining hosts for this loop 41016 1727204182.23249: done getting the remaining hosts for this loop 41016 1727204182.23252: getting the next task for host managed-node1 41016 1727204182.23256: done getting next task for host managed-node1 41016 1727204182.23258: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 41016 1727204182.23261: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204182.23263: getting variables 41016 1727204182.23264: in VariableManager get_vars() 41016 1727204182.23307: Calling all_inventory to load vars for managed-node1 41016 1727204182.23349: Calling groups_inventory to load vars for managed-node1 41016 1727204182.23352: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204182.23359: Calling all_plugins_play to load vars for managed-node1 41016 1727204182.23362: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204182.23364: Calling groups_plugins_play to load vars for managed-node1 41016 1727204182.23539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204182.23766: done with get_vars() 41016 1727204182.23778: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:56:22 -0400 (0:00:00.044) 0:00:05.914 ***** 41016 1727204182.23855: entering _queue_task() for managed-node1/include_tasks 41016 1727204182.24448: worker is 1 (out of 1 available) 41016 1727204182.24458: exiting _queue_task() for managed-node1/include_tasks 41016 1727204182.24471: done queuing things up, now waiting for results queue to drain 41016 1727204182.24472: waiting for pending results... 
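The include chain at this point, reconstructed from the task paths in this trace: tests_route_device.yml:17 pulls in manage_test_interface.yml, whose line 13 pulls in show_interfaces.yml, whose line 3 pulls in get_current_interfaces.yml. A hypothetical sketch of that first task of show_interfaces.yml (the 'Show current_interfaces' debug at line 5 follows it):

# Hypothetical first task of tasks/show_interfaces.yml, per the task paths above.
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml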
41016 1727204182.24999: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 41016 1727204182.25067: in run() - task 028d2410-947f-12d5-0ec4-00000000019d 41016 1727204182.25200: variable 'ansible_search_path' from source: unknown 41016 1727204182.25204: variable 'ansible_search_path' from source: unknown 41016 1727204182.25207: calling self._execute() 41016 1727204182.25214: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.25223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.25308: variable 'omit' from source: magic vars 41016 1727204182.25618: variable 'ansible_distribution_major_version' from source: facts 41016 1727204182.25634: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204182.25640: _execute() done 41016 1727204182.25644: dumping result to json 41016 1727204182.25665: done dumping result, returning 41016 1727204182.25669: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [028d2410-947f-12d5-0ec4-00000000019d] 41016 1727204182.25671: sending task result for task 028d2410-947f-12d5-0ec4-00000000019d 41016 1727204182.25745: done sending task result for task 028d2410-947f-12d5-0ec4-00000000019d 41016 1727204182.25748: WORKER PROCESS EXITING 41016 1727204182.25778: no more pending results, returning what we have 41016 1727204182.25784: in VariableManager get_vars() 41016 1727204182.25834: Calling all_inventory to load vars for managed-node1 41016 1727204182.25837: Calling groups_inventory to load vars for managed-node1 41016 1727204182.25840: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204182.25853: Calling all_plugins_play to load vars for managed-node1 41016 1727204182.25856: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204182.25859: Calling groups_plugins_play to load vars for managed-node1 41016 1727204182.26258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204182.26488: done with get_vars() 41016 1727204182.26496: variable 'ansible_search_path' from source: unknown 41016 1727204182.26497: variable 'ansible_search_path' from source: unknown 41016 1727204182.26556: we have included files to process 41016 1727204182.26557: generating all_blocks data 41016 1727204182.26559: done generating all_blocks data 41016 1727204182.26560: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41016 1727204182.26561: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41016 1727204182.26563: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41016 1727204182.26834: done processing included file 41016 1727204182.26837: iterating over new_blocks loaded from include file 41016 1727204182.26838: in VariableManager get_vars() 41016 1727204182.26857: done with get_vars() 41016 1727204182.26858: filtering new block on tags 41016 1727204182.26879: done filtering new block on tags 41016 1727204182.26882: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node1 41016 1727204182.26886: extending task lists for all hosts with included blocks 41016 1727204182.27056: done extending task lists 41016 1727204182.27057: done processing included files 41016 1727204182.27058: results queue empty 41016 1727204182.27059: checking for any_errors_fatal 41016 1727204182.27062: done checking for any_errors_fatal 41016 1727204182.27062: checking for max_fail_percentage 41016 1727204182.27063: done checking for max_fail_percentage 41016 1727204182.27064: checking to see if all hosts have failed and the running result is not ok 41016 1727204182.27065: done checking to see if all hosts have failed 41016 1727204182.27066: getting the remaining hosts for this loop 41016 1727204182.27067: done getting the remaining hosts for this loop 41016 1727204182.27069: getting the next task for host managed-node1 41016 1727204182.27074: done getting next task for host managed-node1 41016 1727204182.27077: ^ task is: TASK: Gather current interface info 41016 1727204182.27081: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204182.27083: getting variables 41016 1727204182.27084: in VariableManager get_vars() 41016 1727204182.27096: Calling all_inventory to load vars for managed-node1 41016 1727204182.27098: Calling groups_inventory to load vars for managed-node1 41016 1727204182.27100: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204182.27110: Calling all_plugins_play to load vars for managed-node1 41016 1727204182.27112: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204182.27115: Calling groups_plugins_play to load vars for managed-node1 41016 1727204182.27253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204182.27470: done with get_vars() 41016 1727204182.27490: done getting variables 41016 1727204182.27538: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:56:22 -0400 (0:00:00.037) 0:00:05.951 ***** 41016 1727204182.27571: entering _queue_task() for managed-node1/command 41016 1727204182.27925: worker is 1 (out of 1 available) 41016 1727204182.27936: exiting _queue_task() for managed-node1/command 41016 1727204182.27948: done queuing things up, now waiting for results queue to drain 41016 1727204182.27950: waiting for pending results... 
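The trace that follows shows the per-task round trips made because pipelining is off ("Set connection var ansible_pipelining to False" above): an 'echo ~' to resolve the remote home directory, creation of a remote temp directory, an sftp transfer of AnsiballZ_command.py, and a chmod before the module runs. A hedged sketch of the inventory-level variable that enables pipelining and skips the temp-file round trips, assuming the managed nodes permit it:

# Hypothetical group_vars/all.yml entry; with pipelining on, Ansible streams the
# module source over the existing SSH session instead of creating a remote temp
# directory and transferring AnsiballZ_command.py for each task.
ansible_pipelining: true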
41016 1727204182.28496: running TaskExecutor() for managed-node1/TASK: Gather current interface info 41016 1727204182.28501: in run() - task 028d2410-947f-12d5-0ec4-0000000001d4 41016 1727204182.28505: variable 'ansible_search_path' from source: unknown 41016 1727204182.28508: variable 'ansible_search_path' from source: unknown 41016 1727204182.28511: calling self._execute() 41016 1727204182.28514: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.28517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.28520: variable 'omit' from source: magic vars 41016 1727204182.28880: variable 'ansible_distribution_major_version' from source: facts 41016 1727204182.28892: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204182.28898: variable 'omit' from source: magic vars 41016 1727204182.28953: variable 'omit' from source: magic vars 41016 1727204182.28992: variable 'omit' from source: magic vars 41016 1727204182.29031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204182.29136: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204182.29139: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204182.29142: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204182.29144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204182.29146: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204182.29151: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.29153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.29461: Set connection var ansible_shell_executable to /bin/sh 41016 1727204182.29465: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204182.29467: Set connection var ansible_shell_type to sh 41016 1727204182.29470: Set connection var ansible_timeout to 10 41016 1727204182.29472: Set connection var ansible_pipelining to False 41016 1727204182.29474: Set connection var ansible_connection to ssh 41016 1727204182.29478: variable 'ansible_shell_executable' from source: unknown 41016 1727204182.29481: variable 'ansible_connection' from source: unknown 41016 1727204182.29483: variable 'ansible_module_compression' from source: unknown 41016 1727204182.29485: variable 'ansible_shell_type' from source: unknown 41016 1727204182.29487: variable 'ansible_shell_executable' from source: unknown 41016 1727204182.29489: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.29491: variable 'ansible_pipelining' from source: unknown 41016 1727204182.29493: variable 'ansible_timeout' from source: unknown 41016 1727204182.29495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.29499: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204182.29503: variable 'omit' from source: magic vars 41016 
1727204182.29506: starting attempt loop 41016 1727204182.29509: running the handler 41016 1727204182.29519: _low_level_execute_command(): starting 41016 1727204182.29527: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204182.30379: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204182.30427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204182.30439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204182.30458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204182.30570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204182.32364: stdout chunk (state=3): >>>/root <<< 41016 1727204182.32491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204182.32526: stderr chunk (state=3): >>><<< 41016 1727204182.32543: stdout chunk (state=3): >>><<< 41016 1727204182.32577: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204182.32596: _low_level_execute_command(): starting 41016 1727204182.32610: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204182.3258393-41653-196272560307631 `" && echo ansible-tmp-1727204182.3258393-41653-196272560307631="` echo 
/root/.ansible/tmp/ansible-tmp-1727204182.3258393-41653-196272560307631 `" ) && sleep 0' 41016 1727204182.33243: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204182.33257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204182.33287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204182.33304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204182.33343: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204182.33421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204182.33453: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204182.33569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204182.35672: stdout chunk (state=3): >>>ansible-tmp-1727204182.3258393-41653-196272560307631=/root/.ansible/tmp/ansible-tmp-1727204182.3258393-41653-196272560307631 <<< 41016 1727204182.35850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204182.35854: stdout chunk (state=3): >>><<< 41016 1727204182.35856: stderr chunk (state=3): >>><<< 41016 1727204182.35879: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204182.3258393-41653-196272560307631=/root/.ansible/tmp/ansible-tmp-1727204182.3258393-41653-196272560307631 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204182.36019: variable 'ansible_module_compression' from source: unknown 41016 1727204182.36022: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204182.36025: variable 'ansible_facts' from source: unknown 41016 1727204182.36394: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204182.3258393-41653-196272560307631/AnsiballZ_command.py 41016 1727204182.36517: Sending initial data 41016 1727204182.36527: Sent initial data (156 bytes) 41016 1727204182.37671: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204182.37710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204182.37817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204182.37889: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204182.38003: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204182.38032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204182.38194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204182.39911: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204182.40002: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204182.40106: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpcwg8fesq /root/.ansible/tmp/ansible-tmp-1727204182.3258393-41653-196272560307631/AnsiballZ_command.py <<< 41016 1727204182.40109: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204182.3258393-41653-196272560307631/AnsiballZ_command.py" <<< 41016 1727204182.40212: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpcwg8fesq" to remote "/root/.ansible/tmp/ansible-tmp-1727204182.3258393-41653-196272560307631/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204182.3258393-41653-196272560307631/AnsiballZ_command.py" <<< 41016 1727204182.41148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204182.41159: stdout chunk (state=3): >>><<< 41016 1727204182.41178: stderr chunk (state=3): >>><<< 41016 1727204182.41249: done transferring module to remote 41016 1727204182.41281: _low_level_execute_command(): starting 41016 1727204182.41284: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204182.3258393-41653-196272560307631/ /root/.ansible/tmp/ansible-tmp-1727204182.3258393-41653-196272560307631/AnsiballZ_command.py && sleep 0' 41016 1727204182.41995: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204182.42024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204182.42048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204182.42068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204182.42189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204182.44209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204182.44224: stdout chunk (state=3): >>><<< 41016 1727204182.44235: stderr chunk (state=3): >>><<< 41016 1727204182.44254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204182.44339: _low_level_execute_command(): starting 41016 1727204182.44343: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204182.3258393-41653-196272560307631/AnsiballZ_command.py && sleep 0' 41016 1727204182.44864: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204182.44924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204182.44956: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204182.45046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204182.62026: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:56:22.612715", "end": "2024-09-24 14:56:22.616341", "delta": "0:00:00.003626", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204182.63941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204182.63945: stdout chunk (state=3): >>><<< 41016 1727204182.63947: stderr chunk (state=3): >>><<< 41016 1727204182.63966: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:56:22.612715", "end": "2024-09-24 14:56:22.616341", "delta": "0:00:00.003626", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
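The module invocation above amounts to running "ls -1" inside /sys/class/net on the managed node and returning the interface names. A minimal sketch of a task that could produce this invocation, reconstructed from the module_args shown in the result; the register name _current_interfaces is inferred from the variable resolved later in this run, and changed_when: false is inferred from the final result reporting changed=false even though the module itself returned changed=true:

    - name: Gather current interface info
      # command (not shell): _uses_shell is false in the module_args
      command: ls -1
      args:
        chdir: /sys/class/net
      register: _current_interfaces
      changed_when: false
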
41016 1727204182.64184: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204182.3258393-41653-196272560307631/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204182.64188: _low_level_execute_command(): starting 41016 1727204182.64190: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204182.3258393-41653-196272560307631/ > /dev/null 2>&1 && sleep 0' 41016 1727204182.65436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204182.65473: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204182.65493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204182.65616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204182.65711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204182.65735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204182.65838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204182.65899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204182.68318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204182.68321: stdout chunk (state=3): >>><<< 41016 1727204182.68323: stderr chunk (state=3): >>><<< 41016 1727204182.68326: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204182.68328: handler run complete 41016 1727204182.68330: Evaluated conditional (False): False 41016 1727204182.68332: attempt loop complete, returning result 41016 1727204182.68334: _execute() done 41016 1727204182.68336: dumping result to json 41016 1727204182.68338: done dumping result, returning 41016 1727204182.68340: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [028d2410-947f-12d5-0ec4-0000000001d4] 41016 1727204182.68342: sending task result for task 028d2410-947f-12d5-0ec4-0000000001d4 41016 1727204182.68421: done sending task result for task 028d2410-947f-12d5-0ec4-0000000001d4 41016 1727204182.68426: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003626", "end": "2024-09-24 14:56:22.616341", "rc": 0, "start": "2024-09-24 14:56:22.612715" } STDOUT: bonding_masters eth0 lo 41016 1727204182.68517: no more pending results, returning what we have 41016 1727204182.68521: results queue empty 41016 1727204182.68522: checking for any_errors_fatal 41016 1727204182.68524: done checking for any_errors_fatal 41016 1727204182.68525: checking for max_fail_percentage 41016 1727204182.68526: done checking for max_fail_percentage 41016 1727204182.68527: checking to see if all hosts have failed and the running result is not ok 41016 1727204182.68528: done checking to see if all hosts have failed 41016 1727204182.68529: getting the remaining hosts for this loop 41016 1727204182.68530: done getting the remaining hosts for this loop 41016 1727204182.68534: getting the next task for host managed-node1 41016 1727204182.68541: done getting next task for host managed-node1 41016 1727204182.68544: ^ task is: TASK: Set current_interfaces 41016 1727204182.68549: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204182.68553: getting variables 41016 1727204182.68555: in VariableManager get_vars() 41016 1727204182.69023: Calling all_inventory to load vars for managed-node1 41016 1727204182.69026: Calling groups_inventory to load vars for managed-node1 41016 1727204182.69029: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204182.69041: Calling all_plugins_play to load vars for managed-node1 41016 1727204182.69044: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204182.69234: Calling groups_plugins_play to load vars for managed-node1 41016 1727204182.69974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204182.70667: done with get_vars() 41016 1727204182.70682: done getting variables 41016 1727204182.70891: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:56:22 -0400 (0:00:00.435) 0:00:06.387 ***** 41016 1727204182.71145: entering _queue_task() for managed-node1/set_fact 41016 1727204182.72306: worker is 1 (out of 1 available) 41016 1727204182.72317: exiting _queue_task() for managed-node1/set_fact 41016 1727204182.72328: done queuing things up, now waiting for results queue to drain 41016 1727204182.72329: waiting for pending results... 
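The Set current_interfaces task queued here lives at get_current_interfaces.yml:9 and turns the registered command output into the current_interfaces fact. A minimal sketch, assuming the fact is built from the registered result's stdout_lines; the exact Jinja expression is not visible in the log, only the resulting list ['bonding_masters', 'eth0', 'lo']:

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"
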
41016 1727204182.73459: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 41016 1727204182.73464: in run() - task 028d2410-947f-12d5-0ec4-0000000001d5 41016 1727204182.73467: variable 'ansible_search_path' from source: unknown 41016 1727204182.73470: variable 'ansible_search_path' from source: unknown 41016 1727204182.73884: calling self._execute() 41016 1727204182.73888: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.73890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.73892: variable 'omit' from source: magic vars 41016 1727204182.75204: variable 'ansible_distribution_major_version' from source: facts 41016 1727204182.75285: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204182.75298: variable 'omit' from source: magic vars 41016 1727204182.75433: variable 'omit' from source: magic vars 41016 1727204182.75757: variable '_current_interfaces' from source: set_fact 41016 1727204182.75941: variable 'omit' from source: magic vars 41016 1727204182.75993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204182.76372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204182.76377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204182.76380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204182.76466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204182.76616: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204182.76782: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.76786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.76919: Set connection var ansible_shell_executable to /bin/sh 41016 1727204182.76934: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204182.76946: Set connection var ansible_shell_type to sh 41016 1727204182.76957: Set connection var ansible_timeout to 10 41016 1727204182.76969: Set connection var ansible_pipelining to False 41016 1727204182.76992: Set connection var ansible_connection to ssh 41016 1727204182.77040: variable 'ansible_shell_executable' from source: unknown 41016 1727204182.77051: variable 'ansible_connection' from source: unknown 41016 1727204182.77113: variable 'ansible_module_compression' from source: unknown 41016 1727204182.77235: variable 'ansible_shell_type' from source: unknown 41016 1727204182.77239: variable 'ansible_shell_executable' from source: unknown 41016 1727204182.77241: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.77243: variable 'ansible_pipelining' from source: unknown 41016 1727204182.77245: variable 'ansible_timeout' from source: unknown 41016 1727204182.77247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.77649: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 41016 1727204182.77983: variable 'omit' from source: magic vars 41016 1727204182.77986: starting attempt loop 41016 1727204182.77989: running the handler 41016 1727204182.77992: handler run complete 41016 1727204182.77994: attempt loop complete, returning result 41016 1727204182.77997: _execute() done 41016 1727204182.78000: dumping result to json 41016 1727204182.78003: done dumping result, returning 41016 1727204182.78092: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [028d2410-947f-12d5-0ec4-0000000001d5] 41016 1727204182.78098: sending task result for task 028d2410-947f-12d5-0ec4-0000000001d5 41016 1727204182.78172: done sending task result for task 028d2410-947f-12d5-0ec4-0000000001d5 41016 1727204182.78178: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 41016 1727204182.78245: no more pending results, returning what we have 41016 1727204182.78248: results queue empty 41016 1727204182.78249: checking for any_errors_fatal 41016 1727204182.78259: done checking for any_errors_fatal 41016 1727204182.78260: checking for max_fail_percentage 41016 1727204182.78261: done checking for max_fail_percentage 41016 1727204182.78262: checking to see if all hosts have failed and the running result is not ok 41016 1727204182.78263: done checking to see if all hosts have failed 41016 1727204182.78264: getting the remaining hosts for this loop 41016 1727204182.78266: done getting the remaining hosts for this loop 41016 1727204182.78270: getting the next task for host managed-node1 41016 1727204182.78360: done getting next task for host managed-node1 41016 1727204182.78363: ^ task is: TASK: Show current_interfaces 41016 1727204182.78367: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204182.78371: getting variables 41016 1727204182.78373: in VariableManager get_vars() 41016 1727204182.78487: Calling all_inventory to load vars for managed-node1 41016 1727204182.78491: Calling groups_inventory to load vars for managed-node1 41016 1727204182.78534: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204182.78544: Calling all_plugins_play to load vars for managed-node1 41016 1727204182.78547: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204182.78550: Calling groups_plugins_play to load vars for managed-node1 41016 1727204182.79001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204182.79636: done with get_vars() 41016 1727204182.79649: done getting variables 41016 1727204182.79843: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:56:22 -0400 (0:00:00.087) 0:00:06.474 ***** 41016 1727204182.79900: entering _queue_task() for managed-node1/debug 41016 1727204182.80709: worker is 1 (out of 1 available) 41016 1727204182.80723: exiting _queue_task() for managed-node1/debug 41016 1727204182.80734: done queuing things up, now waiting for results queue to drain 41016 1727204182.80735: waiting for pending results... 41016 1727204182.81214: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 41016 1727204182.81583: in run() - task 028d2410-947f-12d5-0ec4-00000000019e 41016 1727204182.81588: variable 'ansible_search_path' from source: unknown 41016 1727204182.81592: variable 'ansible_search_path' from source: unknown 41016 1727204182.81595: calling self._execute() 41016 1727204182.81788: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.81821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.82000: variable 'omit' from source: magic vars 41016 1727204182.82606: variable 'ansible_distribution_major_version' from source: facts 41016 1727204182.82628: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204182.82642: variable 'omit' from source: magic vars 41016 1727204182.82699: variable 'omit' from source: magic vars 41016 1727204182.82851: variable 'current_interfaces' from source: set_fact 41016 1727204182.82893: variable 'omit' from source: magic vars 41016 1727204182.82966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204182.83015: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204182.83041: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204182.83111: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204182.83252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204182.83255: 
variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204182.83257: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.83259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.83414: Set connection var ansible_shell_executable to /bin/sh 41016 1727204182.83474: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204182.83681: Set connection var ansible_shell_type to sh 41016 1727204182.83684: Set connection var ansible_timeout to 10 41016 1727204182.83687: Set connection var ansible_pipelining to False 41016 1727204182.83689: Set connection var ansible_connection to ssh 41016 1727204182.83691: variable 'ansible_shell_executable' from source: unknown 41016 1727204182.83693: variable 'ansible_connection' from source: unknown 41016 1727204182.83695: variable 'ansible_module_compression' from source: unknown 41016 1727204182.83697: variable 'ansible_shell_type' from source: unknown 41016 1727204182.83699: variable 'ansible_shell_executable' from source: unknown 41016 1727204182.83700: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.83702: variable 'ansible_pipelining' from source: unknown 41016 1727204182.83704: variable 'ansible_timeout' from source: unknown 41016 1727204182.83706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.84382: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204182.84386: variable 'omit' from source: magic vars 41016 1727204182.84388: starting attempt loop 41016 1727204182.84390: running the handler 41016 1727204182.84392: handler run complete 41016 1727204182.84394: attempt loop complete, returning result 41016 1727204182.84395: _execute() done 41016 1727204182.84397: dumping result to json 41016 1727204182.84399: done dumping result, returning 41016 1727204182.84401: done running TaskExecutor() for managed-node1/TASK: Show current_interfaces [028d2410-947f-12d5-0ec4-00000000019e] 41016 1727204182.84403: sending task result for task 028d2410-947f-12d5-0ec4-00000000019e 41016 1727204182.84461: done sending task result for task 028d2410-947f-12d5-0ec4-00000000019e 41016 1727204182.84463: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 41016 1727204182.84512: no more pending results, returning what we have 41016 1727204182.84515: results queue empty 41016 1727204182.84516: checking for any_errors_fatal 41016 1727204182.84522: done checking for any_errors_fatal 41016 1727204182.84523: checking for max_fail_percentage 41016 1727204182.84524: done checking for max_fail_percentage 41016 1727204182.84525: checking to see if all hosts have failed and the running result is not ok 41016 1727204182.84526: done checking to see if all hosts have failed 41016 1727204182.84526: getting the remaining hosts for this loop 41016 1727204182.84527: done getting the remaining hosts for this loop 41016 1727204182.84531: getting the next task for host managed-node1 41016 1727204182.84538: done getting next task for host managed-node1 41016 1727204182.84541: ^ task is: TASK: Install iproute 41016 1727204182.84545: ^ state is: HOST STATE: block=2, task=5, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204182.84549: getting variables 41016 1727204182.84550: in VariableManager get_vars() 41016 1727204182.84592: Calling all_inventory to load vars for managed-node1 41016 1727204182.84595: Calling groups_inventory to load vars for managed-node1 41016 1727204182.84598: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204182.84608: Calling all_plugins_play to load vars for managed-node1 41016 1727204182.84611: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204182.84614: Calling groups_plugins_play to load vars for managed-node1 41016 1727204182.84998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204182.85578: done with get_vars() 41016 1727204182.85588: done getting variables 41016 1727204182.85640: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 14:56:22 -0400 (0:00:00.058) 0:00:06.533 ***** 41016 1727204182.85747: entering _queue_task() for managed-node1/package 41016 1727204182.86049: worker is 1 (out of 1 available) 41016 1727204182.86061: exiting _queue_task() for managed-node1/package 41016 1727204182.86073: done queuing things up, now waiting for results queue to drain 41016 1727204182.86077: waiting for pending results... 
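The Show current_interfaces result a few entries above (MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']) comes from a plain debug task at show_interfaces.yml:5. A minimal sketch; the exact message template is assumed from the MSG line in the output:

    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"
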
41016 1727204182.86432: running TaskExecutor() for managed-node1/TASK: Install iproute 41016 1727204182.86479: in run() - task 028d2410-947f-12d5-0ec4-00000000016d 41016 1727204182.86528: variable 'ansible_search_path' from source: unknown 41016 1727204182.86531: variable 'ansible_search_path' from source: unknown 41016 1727204182.86553: calling self._execute() 41016 1727204182.86646: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.86660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.86671: variable 'omit' from source: magic vars 41016 1727204182.87073: variable 'ansible_distribution_major_version' from source: facts 41016 1727204182.87090: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204182.87183: variable 'omit' from source: magic vars 41016 1727204182.87186: variable 'omit' from source: magic vars 41016 1727204182.87360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204182.90094: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204182.90142: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204182.90169: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204182.90196: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204182.90230: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204182.90303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204182.90328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204182.90347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204182.90372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204182.90385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204182.90464: variable '__network_is_ostree' from source: set_fact 41016 1727204182.90467: variable 'omit' from source: magic vars 41016 1727204182.90493: variable 'omit' from source: magic vars 41016 1727204182.90516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204182.90549: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204182.90567: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204182.90582: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 41016 1727204182.90591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204182.90614: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204182.90617: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.90626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.90745: Set connection var ansible_shell_executable to /bin/sh 41016 1727204182.90748: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204182.90750: Set connection var ansible_shell_type to sh 41016 1727204182.90752: Set connection var ansible_timeout to 10 41016 1727204182.90756: Set connection var ansible_pipelining to False 41016 1727204182.90758: Set connection var ansible_connection to ssh 41016 1727204182.90867: variable 'ansible_shell_executable' from source: unknown 41016 1727204182.90870: variable 'ansible_connection' from source: unknown 41016 1727204182.90873: variable 'ansible_module_compression' from source: unknown 41016 1727204182.90877: variable 'ansible_shell_type' from source: unknown 41016 1727204182.90879: variable 'ansible_shell_executable' from source: unknown 41016 1727204182.90881: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204182.90883: variable 'ansible_pipelining' from source: unknown 41016 1727204182.90886: variable 'ansible_timeout' from source: unknown 41016 1727204182.90888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204182.90891: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204182.90893: variable 'omit' from source: magic vars 41016 1727204182.90895: starting attempt loop 41016 1727204182.90897: running the handler 41016 1727204182.90917: variable 'ansible_facts' from source: unknown 41016 1727204182.90920: variable 'ansible_facts' from source: unknown 41016 1727204182.91082: _low_level_execute_command(): starting 41016 1727204182.91085: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204182.91553: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204182.91614: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204182.91713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204182.91717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204182.91754: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204182.91757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 41016 1727204182.91760: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204182.91762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 
1727204182.91798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 41016 1727204182.91802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204182.91888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204182.91895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204182.91925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204182.92016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204182.93788: stdout chunk (state=3): >>>/root <<< 41016 1727204182.93893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204182.93925: stderr chunk (state=3): >>><<< 41016 1727204182.93928: stdout chunk (state=3): >>><<< 41016 1727204182.93950: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204182.93961: _low_level_execute_command(): starting 41016 1727204182.93967: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204182.939494-41762-41728092336206 `" && echo ansible-tmp-1727204182.939494-41762-41728092336206="` echo /root/.ansible/tmp/ansible-tmp-1727204182.939494-41762-41728092336206 `" ) && sleep 0' 41016 1727204182.94684: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204182.94687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204182.94833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204182.94885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204182.97346: stdout chunk (state=3): >>>ansible-tmp-1727204182.939494-41762-41728092336206=/root/.ansible/tmp/ansible-tmp-1727204182.939494-41762-41728092336206 <<< 41016 1727204182.97517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204182.97521: stdout chunk (state=3): >>><<< 41016 1727204182.97523: stderr chunk (state=3): >>><<< 41016 1727204182.97544: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204182.939494-41762-41728092336206=/root/.ansible/tmp/ansible-tmp-1727204182.939494-41762-41728092336206 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204182.97682: variable 'ansible_module_compression' from source: unknown 41016 1727204182.97687: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 41016 1727204182.97689: ANSIBALLZ: Acquiring lock 41016 1727204182.97691: ANSIBALLZ: Lock acquired: 140580610774160 41016 1727204182.97693: ANSIBALLZ: Creating module 41016 1727204183.21350: ANSIBALLZ: Writing module into payload 41016 1727204183.21570: ANSIBALLZ: Writing module 41016 1727204183.21593: ANSIBALLZ: Renaming module 41016 1727204183.21606: ANSIBALLZ: Done creating module 41016 1727204183.21625: variable 'ansible_facts' from source: unknown 41016 1727204183.21733: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204182.939494-41762-41728092336206/AnsiballZ_dnf.py 41016 1727204183.21919: Sending initial data 41016 1727204183.21922: Sent initial data (150 bytes) 41016 1727204183.22814: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204183.22818: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204183.22820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204183.22844: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204183.22952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204183.24697: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204183.24889: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204183.24981: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpgu_xs86l /root/.ansible/tmp/ansible-tmp-1727204182.939494-41762-41728092336206/AnsiballZ_dnf.py <<< 41016 1727204183.24984: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204182.939494-41762-41728092336206/AnsiballZ_dnf.py" <<< 41016 1727204183.25052: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpgu_xs86l" to remote "/root/.ansible/tmp/ansible-tmp-1727204182.939494-41762-41728092336206/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204182.939494-41762-41728092336206/AnsiballZ_dnf.py" <<< 41016 1727204183.26762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204183.26815: stderr chunk (state=3): >>><<< 41016 1727204183.26818: stdout chunk (state=3): >>><<< 41016 1727204183.26884: done transferring module to remote 41016 1727204183.26912: _low_level_execute_command(): starting 41016 1727204183.26915: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204182.939494-41762-41728092336206/ /root/.ansible/tmp/ansible-tmp-1727204182.939494-41762-41728092336206/AnsiballZ_dnf.py && sleep 0' 41016 1727204183.27624: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204183.27640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204183.27750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204183.27824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204183.27931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204183.30086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204183.30089: stdout chunk (state=3): >>><<< 41016 1727204183.30092: stderr chunk (state=3): >>><<< 41016 1727204183.30094: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204183.30096: _low_level_execute_command(): starting 41016 1727204183.30098: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204182.939494-41762-41728092336206/AnsiballZ_dnf.py && sleep 0' 41016 1727204183.31653: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204183.31706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204183.31856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204183.77561: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 41016 1727204183.82995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204183.83048: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204183.83052: stdout chunk (state=3): >>><<< 41016 1727204183.83066: stderr chunk (state=3): >>><<< 41016 1727204183.83088: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
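The dnf result just above (module ansible.legacy.dnf, name=['iproute'], state=present, msg "Nothing to do") means iproute was already installed on managed-node1. The task file itself is not reproduced in this log; a minimal sketch of a task that would produce this invocation, with the retry handling inferred as an assumption from the "attempts": 1 field and the "__install_status is success" conditional evaluated further down, looks roughly like:

# Hedged sketch only -- the real task file is not shown in this log.
# The play may equally have used the generic "package" module; only the
# resolved ansible.legacy.dnf invocation is visible here.
- name: Install iproute
  dnf:
    name: iproute
    state: present
  register: __install_status          # the log later evaluates "__install_status is success"
  until: __install_status is success  # assumption, consistent with "attempts": 1 in the result
  retries: 3                          # placeholder; the actual retry/delay values are not visible
  delay: 5
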
41016 1727204183.83282: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204182.939494-41762-41728092336206/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204183.83286: _low_level_execute_command(): starting 41016 1727204183.83289: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204182.939494-41762-41728092336206/ > /dev/null 2>&1 && sleep 0' 41016 1727204183.83785: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204183.83793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204183.83806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204183.83829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204183.83888: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204183.83891: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204183.83894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204183.83896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204183.83898: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204183.83960: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204183.83982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204183.83995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204183.84018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204183.84167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204183.86106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204183.86130: stderr chunk (state=3): >>><<< 41016 1727204183.86133: stdout chunk (state=3): >>><<< 41016 1727204183.86149: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204183.86155: handler run complete 41016 1727204183.86283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204183.86413: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204183.86445: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204183.86471: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204183.86492: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204183.86546: variable '__install_status' from source: unknown 41016 1727204183.86561: Evaluated conditional (__install_status is success): True 41016 1727204183.86580: attempt loop complete, returning result 41016 1727204183.86583: _execute() done 41016 1727204183.86585: dumping result to json 41016 1727204183.86587: done dumping result, returning 41016 1727204183.86593: done running TaskExecutor() for managed-node1/TASK: Install iproute [028d2410-947f-12d5-0ec4-00000000016d] 41016 1727204183.86596: sending task result for task 028d2410-947f-12d5-0ec4-00000000016d 41016 1727204183.86690: done sending task result for task 028d2410-947f-12d5-0ec4-00000000016d 41016 1727204183.86693: WORKER PROCESS EXITING ok: [managed-node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 41016 1727204183.86772: no more pending results, returning what we have 41016 1727204183.86777: results queue empty 41016 1727204183.86778: checking for any_errors_fatal 41016 1727204183.86782: done checking for any_errors_fatal 41016 1727204183.86783: checking for max_fail_percentage 41016 1727204183.86784: done checking for max_fail_percentage 41016 1727204183.86785: checking to see if all hosts have failed and the running result is not ok 41016 1727204183.86786: done checking to see if all hosts have failed 41016 1727204183.86787: getting the remaining hosts for this loop 41016 1727204183.86788: done getting the remaining hosts for this loop 41016 1727204183.86791: getting the next task for host managed-node1 41016 1727204183.86797: done getting next task for host managed-node1 41016 1727204183.86799: ^ task is: TASK: Create veth interface {{ interface }} 41016 1727204183.86802: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204183.86806: getting variables 41016 1727204183.86808: in VariableManager get_vars() 41016 1727204183.86847: Calling all_inventory to load vars for managed-node1 41016 1727204183.86850: Calling groups_inventory to load vars for managed-node1 41016 1727204183.86852: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204183.86861: Calling all_plugins_play to load vars for managed-node1 41016 1727204183.86864: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204183.86866: Calling groups_plugins_play to load vars for managed-node1 41016 1727204183.87113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204183.87242: done with get_vars() 41016 1727204183.87250: done getting variables 41016 1727204183.87292: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204183.87393: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 14:56:23 -0400 (0:00:01.016) 0:00:07.550 ***** 41016 1727204183.87446: entering _queue_task() for managed-node1/command 41016 1727204183.87660: worker is 1 (out of 1 available) 41016 1727204183.87673: exiting _queue_task() for managed-node1/command 41016 1727204183.87686: done queuing things up, now waiting for results queue to drain 41016 1727204183.87687: waiting for pending results... 
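The task starting here comes from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27. Its body is not reproduced in the log, but the visible pieces (the 'items' lookup, the command action plugin, the templated name "Create veth interface {{ interface }}", the type/state/current_interfaces conditional evaluated below, and the ip commands run for interface=ethtest0) are consistent with a loop of commands roughly like the following hedged reconstruction. Only the loop items that appear in this portion of the log are listed; the real file may contain more.

# Hedged reconstruction -- not copied from manage_test_interface.yml.
# The log also evaluates "ansible_distribution_major_version != '6'", most
# likely inherited from the calling play or include rather than this task.
- name: Create veth interface {{ interface }}
  command: "{{ item }}"
  with_items:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
    - ip link set peer{{ interface }} up
  when:
    - type == 'veth'
    - state == 'present'
    - interface not in current_interfaces
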
41016 1727204183.87868: running TaskExecutor() for managed-node1/TASK: Create veth interface ethtest0 41016 1727204183.87955: in run() - task 028d2410-947f-12d5-0ec4-00000000016e 41016 1727204183.87965: variable 'ansible_search_path' from source: unknown 41016 1727204183.87968: variable 'ansible_search_path' from source: unknown 41016 1727204183.88172: variable 'interface' from source: set_fact 41016 1727204183.88239: variable 'interface' from source: set_fact 41016 1727204183.88285: variable 'interface' from source: set_fact 41016 1727204183.88394: Loaded config def from plugin (lookup/items) 41016 1727204183.88398: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 41016 1727204183.88420: variable 'omit' from source: magic vars 41016 1727204183.88503: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204183.88511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204183.88523: variable 'omit' from source: magic vars 41016 1727204183.88686: variable 'ansible_distribution_major_version' from source: facts 41016 1727204183.88693: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204183.88824: variable 'type' from source: set_fact 41016 1727204183.88827: variable 'state' from source: include params 41016 1727204183.88830: variable 'interface' from source: set_fact 41016 1727204183.88834: variable 'current_interfaces' from source: set_fact 41016 1727204183.88847: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 41016 1727204183.88849: variable 'omit' from source: magic vars 41016 1727204183.88869: variable 'omit' from source: magic vars 41016 1727204183.88904: variable 'item' from source: unknown 41016 1727204183.88955: variable 'item' from source: unknown 41016 1727204183.88966: variable 'omit' from source: magic vars 41016 1727204183.88990: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204183.89015: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204183.89029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204183.89041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204183.89050: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204183.89076: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204183.89079: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204183.89082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204183.89152: Set connection var ansible_shell_executable to /bin/sh 41016 1727204183.89155: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204183.89161: Set connection var ansible_shell_type to sh 41016 1727204183.89168: Set connection var ansible_timeout to 10 41016 1727204183.89171: Set connection var ansible_pipelining to False 41016 1727204183.89181: Set connection var ansible_connection to ssh 41016 1727204183.89194: variable 'ansible_shell_executable' from source: unknown 41016 1727204183.89197: variable 'ansible_connection' from source: unknown 41016 1727204183.89199: variable 
'ansible_module_compression' from source: unknown 41016 1727204183.89201: variable 'ansible_shell_type' from source: unknown 41016 1727204183.89204: variable 'ansible_shell_executable' from source: unknown 41016 1727204183.89206: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204183.89211: variable 'ansible_pipelining' from source: unknown 41016 1727204183.89218: variable 'ansible_timeout' from source: unknown 41016 1727204183.89220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204183.89317: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204183.89330: variable 'omit' from source: magic vars 41016 1727204183.89333: starting attempt loop 41016 1727204183.89336: running the handler 41016 1727204183.89347: _low_level_execute_command(): starting 41016 1727204183.89354: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204183.90011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204183.90016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204183.90020: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204183.90034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204183.90132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204183.91925: stdout chunk (state=3): >>>/root <<< 41016 1727204183.92021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204183.92053: stderr chunk (state=3): >>><<< 41016 1727204183.92056: stdout chunk (state=3): >>><<< 41016 1727204183.92085: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204183.92098: _low_level_execute_command(): starting 41016 1727204183.92104: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204183.9208434-41889-224825682081964 `" && echo ansible-tmp-1727204183.9208434-41889-224825682081964="` echo /root/.ansible/tmp/ansible-tmp-1727204183.9208434-41889-224825682081964 `" ) && sleep 0' 41016 1727204183.92682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204183.92686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204183.92688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204183.92690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204183.92692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 41016 1727204183.92694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204183.92745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204183.92752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204183.92832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204183.94947: stdout chunk (state=3): >>>ansible-tmp-1727204183.9208434-41889-224825682081964=/root/.ansible/tmp/ansible-tmp-1727204183.9208434-41889-224825682081964 <<< 41016 1727204183.95183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204183.95186: stdout chunk (state=3): >>><<< 41016 1727204183.95188: stderr chunk (state=3): >>><<< 41016 1727204183.95294: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204183.9208434-41889-224825682081964=/root/.ansible/tmp/ansible-tmp-1727204183.9208434-41889-224825682081964 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204183.95298: variable 'ansible_module_compression' from source: unknown 41016 1727204183.95537: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204183.95540: variable 'ansible_facts' from source: unknown 41016 1727204183.95625: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204183.9208434-41889-224825682081964/AnsiballZ_command.py 41016 1727204183.95794: Sending initial data 41016 1727204183.95798: Sent initial data (156 bytes) 41016 1727204183.96692: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204183.96771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204183.96816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204183.96837: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204183.97054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204183.98708: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204183.98817: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204183.98916: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpepe41qvx /root/.ansible/tmp/ansible-tmp-1727204183.9208434-41889-224825682081964/AnsiballZ_command.py <<< 41016 1727204183.98919: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204183.9208434-41889-224825682081964/AnsiballZ_command.py" <<< 41016 1727204183.98984: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpepe41qvx" to remote "/root/.ansible/tmp/ansible-tmp-1727204183.9208434-41889-224825682081964/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204183.9208434-41889-224825682081964/AnsiballZ_command.py" <<< 41016 1727204183.99912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204183.99923: stderr chunk (state=3): >>><<< 41016 1727204183.99930: stdout chunk (state=3): >>><<< 41016 1727204184.00058: done transferring module to remote 41016 1727204184.00061: _low_level_execute_command(): starting 41016 1727204184.00063: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204183.9208434-41889-224825682081964/ /root/.ansible/tmp/ansible-tmp-1727204183.9208434-41889-224825682081964/AnsiballZ_command.py && sleep 0' 41016 1727204184.01292: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204184.01388: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204184.01503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204184.03520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204184.03524: stdout chunk (state=3): >>><<< 41016 1727204184.03536: stderr chunk (state=3): >>><<< 41016 1727204184.03693: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204184.03697: _low_level_execute_command(): starting 41016 1727204184.03702: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204183.9208434-41889-224825682081964/AnsiballZ_command.py && sleep 0' 41016 1727204184.05018: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204184.05167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204184.05289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204184.22648: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 14:56:24.214833", "end": "2024-09-24 14:56:24.220205", "delta": "0:00:00.005372", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204184.25750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204184.25779: stdout chunk (state=3): >>><<< 41016 1727204184.26244: stderr chunk (state=3): >>><<< 41016 1727204184.26248: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 14:56:24.214833", "end": "2024-09-24 14:56:24.220205", "delta": "0:00:00.005372", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
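The first loop item has now completed on managed-node1: "ip link add ethtest0 type veth peer name peerethtest0" returned rc=0 in roughly 5 ms, so the ethtest0/peerethtest0 veth pair exists at this point. An out-of-band way to confirm that (illustrative only; not part of the logged run) would be:

# Illustrative verification task -- not executed by the logged playbook.
- name: Confirm the veth pair exists
  command: ip link show {{ item }}
  loop:
    - ethtest0
    - peerethtest0
  changed_when: false   # a read-only query should never report a change
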
41016 1727204184.26251: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204183.9208434-41889-224825682081964/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204184.26253: _low_level_execute_command(): starting 41016 1727204184.26254: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204183.9208434-41889-224825682081964/ > /dev/null 2>&1 && sleep 0' 41016 1727204184.27419: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204184.27679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204184.27802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204184.27920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204184.31228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204184.31253: stderr chunk (state=3): >>><<< 41016 1727204184.31336: stdout chunk (state=3): >>><<< 41016 1727204184.31348: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204184.31355: handler run complete 41016 1727204184.31444: Evaluated conditional (False): False 41016 1727204184.31447: attempt loop complete, returning result 41016 1727204184.31450: variable 'item' from source: unknown 41016 1727204184.31889: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.005372", "end": "2024-09-24 14:56:24.220205", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-24 14:56:24.214833" } 41016 1727204184.32202: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204184.32206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204184.32221: variable 'omit' from source: magic vars 41016 1727204184.32573: variable 'ansible_distribution_major_version' from source: facts 41016 1727204184.32636: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204184.33314: variable 'type' from source: set_fact 41016 1727204184.33318: variable 'state' from source: include params 41016 1727204184.33320: variable 'interface' from source: set_fact 41016 1727204184.33344: variable 'current_interfaces' from source: set_fact 41016 1727204184.33347: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 41016 1727204184.33350: variable 'omit' from source: magic vars 41016 1727204184.33352: variable 'omit' from source: magic vars 41016 1727204184.33441: variable 'item' from source: unknown 41016 1727204184.33619: variable 'item' from source: unknown 41016 1727204184.33647: variable 'omit' from source: magic vars 41016 1727204184.33674: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204184.33684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204184.33692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204184.33769: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204184.33773: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204184.33776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204184.34486: Set connection var ansible_shell_executable to /bin/sh 41016 1727204184.34489: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204184.34491: Set connection var ansible_shell_type to sh 41016 1727204184.34493: Set connection var ansible_timeout to 10 41016 1727204184.34494: Set connection var ansible_pipelining to False 41016 1727204184.34496: Set connection var ansible_connection to ssh 41016 1727204184.34498: variable 'ansible_shell_executable' from source: unknown 41016 1727204184.34499: variable 'ansible_connection' from 
source: unknown 41016 1727204184.34501: variable 'ansible_module_compression' from source: unknown 41016 1727204184.34503: variable 'ansible_shell_type' from source: unknown 41016 1727204184.34504: variable 'ansible_shell_executable' from source: unknown 41016 1727204184.34506: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204184.34507: variable 'ansible_pipelining' from source: unknown 41016 1727204184.34511: variable 'ansible_timeout' from source: unknown 41016 1727204184.34513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204184.34515: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204184.34517: variable 'omit' from source: magic vars 41016 1727204184.34519: starting attempt loop 41016 1727204184.34521: running the handler 41016 1727204184.34523: _low_level_execute_command(): starting 41016 1727204184.34524: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204184.35318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204184.35322: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204184.35325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204184.35328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204184.35331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204184.35334: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204184.35337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204184.35339: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204184.35342: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204184.35424: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204184.35434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204184.35448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204184.35843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204184.37523: stdout chunk (state=3): >>>/root <<< 41016 1727204184.37567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204184.37570: stdout chunk (state=3): >>><<< 41016 1727204184.37579: stderr chunk (state=3): >>><<< 41016 1727204184.37642: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204184.37651: _low_level_execute_command(): starting 41016 1727204184.37656: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204184.3764129-41889-151092582282278 `" && echo ansible-tmp-1727204184.3764129-41889-151092582282278="` echo /root/.ansible/tmp/ansible-tmp-1727204184.3764129-41889-151092582282278 `" ) && sleep 0' 41016 1727204184.38824: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204184.39059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204184.39063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204184.39065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204184.39068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204184.39070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204184.39201: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204184.39395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204184.41419: stdout chunk (state=3): >>>ansible-tmp-1727204184.3764129-41889-151092582282278=/root/.ansible/tmp/ansible-tmp-1727204184.3764129-41889-151092582282278 <<< 41016 1727204184.41783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204184.41787: stderr chunk (state=3): >>><<< 41016 1727204184.41789: stdout chunk (state=3): >>><<< 41016 1727204184.41791: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204184.3764129-41889-151092582282278=/root/.ansible/tmp/ansible-tmp-1727204184.3764129-41889-151092582282278 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204184.41793: variable 'ansible_module_compression' from source: unknown 41016 1727204184.41795: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204184.41797: variable 'ansible_facts' from source: unknown 41016 1727204184.41843: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204184.3764129-41889-151092582282278/AnsiballZ_command.py 41016 1727204184.42540: Sending initial data 41016 1727204184.42543: Sent initial data (156 bytes) 41016 1727204184.43917: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204184.43921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204184.44019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204184.44208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204184.44248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204184.44312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204184.44391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204184.46134: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204184.46217: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204184.46292: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpf_5xfo3i /root/.ansible/tmp/ansible-tmp-1727204184.3764129-41889-151092582282278/AnsiballZ_command.py <<< 41016 1727204184.46296: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204184.3764129-41889-151092582282278/AnsiballZ_command.py" <<< 41016 1727204184.46467: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpf_5xfo3i" to remote "/root/.ansible/tmp/ansible-tmp-1727204184.3764129-41889-151092582282278/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204184.3764129-41889-151092582282278/AnsiballZ_command.py" <<< 41016 1727204184.48024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204184.48050: stderr chunk (state=3): >>><<< 41016 1727204184.48053: stdout chunk (state=3): >>><<< 41016 1727204184.48116: done transferring module to remote 41016 1727204184.48124: _low_level_execute_command(): starting 41016 1727204184.48202: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204184.3764129-41889-151092582282278/ /root/.ansible/tmp/ansible-tmp-1727204184.3764129-41889-151092582282278/AnsiballZ_command.py && sleep 0' 41016 1727204184.49703: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204184.49707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204184.49712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41016 1727204184.49714: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204184.49716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204184.49777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 
1727204184.49837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204184.51956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204184.51966: stdout chunk (state=3): >>><<< 41016 1727204184.51979: stderr chunk (state=3): >>><<< 41016 1727204184.52067: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204184.52079: _low_level_execute_command(): starting 41016 1727204184.52089: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204184.3764129-41889-151092582282278/AnsiballZ_command.py && sleep 0' 41016 1727204184.53644: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204184.53665: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204184.53683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204184.53787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204184.54001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204184.54120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204184.71349: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 14:56:24.707716", "end": "2024-09-24 14:56:24.711676", "delta": "0:00:00.003960", "msg": "", 
"invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204184.73212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204184.73224: stdout chunk (state=3): >>><<< 41016 1727204184.73265: stderr chunk (state=3): >>><<< 41016 1727204184.73291: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 14:56:24.707716", "end": "2024-09-24 14:56:24.711676", "delta": "0:00:00.003960", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
41016 1727204184.73332: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204184.3764129-41889-151092582282278/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204184.73346: _low_level_execute_command(): starting 41016 1727204184.73368: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204184.3764129-41889-151092582282278/ > /dev/null 2>&1 && sleep 0' 41016 1727204184.74405: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204184.74408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204184.74410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204184.74412: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204184.74414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204184.74463: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204184.74477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204184.74579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204184.76681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204184.76685: stdout chunk (state=3): >>><<< 41016 1727204184.76687: stderr chunk (state=3): >>><<< 41016 1727204184.76690: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204184.76692: handler run complete 41016 1727204184.76694: Evaluated conditional (False): False 41016 1727204184.76699: attempt loop complete, returning result 41016 1727204184.76724: variable 'item' from source: unknown 41016 1727204184.76828: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.003960", "end": "2024-09-24 14:56:24.711676", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-24 14:56:24.707716" } 41016 1727204184.76980: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204184.76984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204184.76987: variable 'omit' from source: magic vars 41016 1727204184.77381: variable 'ansible_distribution_major_version' from source: facts 41016 1727204184.77385: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204184.77411: variable 'type' from source: set_fact 41016 1727204184.77419: variable 'state' from source: include params 41016 1727204184.77427: variable 'interface' from source: set_fact 41016 1727204184.77432: variable 'current_interfaces' from source: set_fact 41016 1727204184.77438: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 41016 1727204184.77442: variable 'omit' from source: magic vars 41016 1727204184.77458: variable 'omit' from source: magic vars 41016 1727204184.77499: variable 'item' from source: unknown 41016 1727204184.77573: variable 'item' from source: unknown 41016 1727204184.77589: variable 'omit' from source: magic vars 41016 1727204184.77610: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204184.77624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204184.77630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204184.77644: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204184.77647: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204184.77650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204184.77748: Set connection var ansible_shell_executable to /bin/sh 41016 1727204184.77753: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204184.77758: Set connection var ansible_shell_type to sh 41016 1727204184.77764: Set connection var ansible_timeout to 10 41016 1727204184.77769: Set connection var ansible_pipelining to False 41016 1727204184.77778: Set connection var ansible_connection to ssh 41016 1727204184.77809: variable 
'ansible_shell_executable' from source: unknown 41016 1727204184.77815: variable 'ansible_connection' from source: unknown 41016 1727204184.77818: variable 'ansible_module_compression' from source: unknown 41016 1727204184.77820: variable 'ansible_shell_type' from source: unknown 41016 1727204184.77823: variable 'ansible_shell_executable' from source: unknown 41016 1727204184.77827: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204184.77831: variable 'ansible_pipelining' from source: unknown 41016 1727204184.77833: variable 'ansible_timeout' from source: unknown 41016 1727204184.77838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204184.78051: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204184.78059: variable 'omit' from source: magic vars 41016 1727204184.78061: starting attempt loop 41016 1727204184.78064: running the handler 41016 1727204184.78066: _low_level_execute_command(): starting 41016 1727204184.78068: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204184.78606: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204184.78622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204184.78638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204184.78654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204184.78671: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204184.78767: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204184.78791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204184.78811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204184.78927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204184.80832: stdout chunk (state=3): >>>/root <<< 41016 1727204184.80862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204184.80900: stderr chunk (state=3): >>><<< 41016 1727204184.80911: stdout chunk (state=3): >>><<< 41016 1727204184.80934: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204184.80948: _low_level_execute_command(): starting 41016 1727204184.80957: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204184.8093998-41889-268195268379579 `" && echo ansible-tmp-1727204184.8093998-41889-268195268379579="` echo /root/.ansible/tmp/ansible-tmp-1727204184.8093998-41889-268195268379579 `" ) && sleep 0' 41016 1727204184.81643: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204184.81669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204184.81687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204184.81779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204184.81820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204184.81843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204184.81861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204184.82015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204184.84125: stdout chunk (state=3): >>>ansible-tmp-1727204184.8093998-41889-268195268379579=/root/.ansible/tmp/ansible-tmp-1727204184.8093998-41889-268195268379579 <<< 41016 1727204184.84299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204184.84303: stdout chunk (state=3): >>><<< 41016 1727204184.84306: stderr chunk (state=3): >>><<< 41016 1727204184.84692: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204184.8093998-41889-268195268379579=/root/.ansible/tmp/ansible-tmp-1727204184.8093998-41889-268195268379579 , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204184.84696: variable 'ansible_module_compression' from source: unknown 41016 1727204184.84699: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204184.84702: variable 'ansible_facts' from source: unknown 41016 1727204184.84704: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204184.8093998-41889-268195268379579/AnsiballZ_command.py 41016 1727204184.85098: Sending initial data 41016 1727204184.85111: Sent initial data (156 bytes) 41016 1727204184.85980: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204184.85996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41016 1727204184.86025: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204184.86146: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204184.86397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204184.86501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204184.88252: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204184.88346: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204184.88458: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpe99we_jt /root/.ansible/tmp/ansible-tmp-1727204184.8093998-41889-268195268379579/AnsiballZ_command.py <<< 41016 1727204184.88469: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204184.8093998-41889-268195268379579/AnsiballZ_command.py" <<< 41016 1727204184.88538: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpe99we_jt" to remote "/root/.ansible/tmp/ansible-tmp-1727204184.8093998-41889-268195268379579/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204184.8093998-41889-268195268379579/AnsiballZ_command.py" <<< 41016 1727204184.89570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204184.89612: stderr chunk (state=3): >>><<< 41016 1727204184.89751: stdout chunk (state=3): >>><<< 41016 1727204184.89754: done transferring module to remote 41016 1727204184.89757: _low_level_execute_command(): starting 41016 1727204184.89759: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204184.8093998-41889-268195268379579/ /root/.ansible/tmp/ansible-tmp-1727204184.8093998-41889-268195268379579/AnsiballZ_command.py && sleep 0' 41016 1727204184.90464: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204184.90564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204184.90597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204184.90650: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204184.90754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204184.90907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204184.92919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204184.92930: stdout chunk (state=3): >>><<< 41016 
1727204184.92940: stderr chunk (state=3): >>><<< 41016 1727204184.92969: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204184.92978: _low_level_execute_command(): starting 41016 1727204184.92987: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204184.8093998-41889-268195268379579/AnsiballZ_command.py && sleep 0' 41016 1727204184.93693: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204184.93720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204184.93746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204184.93764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204184.93800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204184.93857: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204184.93922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204184.93939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204184.93972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204184.94100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204185.11261: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 14:56:25.106165", "end": "2024-09-24 14:56:25.110367", "delta": "0:00:00.004202", "msg": "", "invocation": {"module_args": {"_raw_params": "ip 
link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204185.13069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204185.13073: stdout chunk (state=3): >>><<< 41016 1727204185.13080: stderr chunk (state=3): >>><<< 41016 1727204185.13099: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 14:56:25.106165", "end": "2024-09-24 14:56:25.110367", "delta": "0:00:00.004202", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
41016 1727204185.13117: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204184.8093998-41889-268195268379579/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204185.13127: _low_level_execute_command(): starting 41016 1727204185.13130: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204184.8093998-41889-268195268379579/ > /dev/null 2>&1 && sleep 0' 41016 1727204185.13613: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204185.13617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204185.13619: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204185.13621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204185.13672: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204185.13677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204185.13679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204185.13749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204185.15706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204185.15735: stderr chunk (state=3): >>><<< 41016 1727204185.15738: stdout chunk (state=3): >>><<< 41016 1727204185.15771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204185.15774: handler run complete 41016 1727204185.15823: Evaluated conditional (False): False 41016 1727204185.15826: attempt loop complete, returning result 41016 1727204185.15828: variable 'item' from source: unknown 41016 1727204185.16013: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.004202", "end": "2024-09-24 14:56:25.110367", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-24 14:56:25.106165" } 41016 1727204185.16126: dumping result to json 41016 1727204185.16129: done dumping result, returning 41016 1727204185.16131: done running TaskExecutor() for managed-node1/TASK: Create veth interface ethtest0 [028d2410-947f-12d5-0ec4-00000000016e] 41016 1727204185.16133: sending task result for task 028d2410-947f-12d5-0ec4-00000000016e 41016 1727204185.16195: done sending task result for task 028d2410-947f-12d5-0ec4-00000000016e 41016 1727204185.16199: WORKER PROCESS EXITING 41016 1727204185.16288: no more pending results, returning what we have 41016 1727204185.16292: results queue empty 41016 1727204185.16293: checking for any_errors_fatal 41016 1727204185.16304: done checking for any_errors_fatal 41016 1727204185.16305: checking for max_fail_percentage 41016 1727204185.16313: done checking for max_fail_percentage 41016 1727204185.16314: checking to see if all hosts have failed and the running result is not ok 41016 1727204185.16315: done checking to see if all hosts have failed 41016 1727204185.16315: getting the remaining hosts for this loop 41016 1727204185.16317: done getting the remaining hosts for this loop 41016 1727204185.16320: getting the next task for host managed-node1 41016 1727204185.16326: done getting next task for host managed-node1 41016 1727204185.16328: ^ task is: TASK: Set up veth as managed by NetworkManager 41016 1727204185.16331: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204185.16334: getting variables 41016 1727204185.16335: in VariableManager get_vars() 41016 1727204185.16372: Calling all_inventory to load vars for managed-node1 41016 1727204185.16375: Calling groups_inventory to load vars for managed-node1 41016 1727204185.16380: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204185.16389: Calling all_plugins_play to load vars for managed-node1 41016 1727204185.16391: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204185.16394: Calling groups_plugins_play to load vars for managed-node1 41016 1727204185.16709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204185.16913: done with get_vars() 41016 1727204185.16924: done getting variables 41016 1727204185.16991: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 14:56:25 -0400 (0:00:01.295) 0:00:08.846 ***** 41016 1727204185.17021: entering _queue_task() for managed-node1/command 41016 1727204185.17255: worker is 1 (out of 1 available) 41016 1727204185.17267: exiting _queue_task() for managed-node1/command 41016 1727204185.17282: done queuing things up, now waiting for results queue to drain 41016 1727204185.17283: waiting for pending results... 
41016 1727204185.17451: running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager 41016 1727204185.17520: in run() - task 028d2410-947f-12d5-0ec4-00000000016f 41016 1727204185.17530: variable 'ansible_search_path' from source: unknown 41016 1727204185.17534: variable 'ansible_search_path' from source: unknown 41016 1727204185.17566: calling self._execute() 41016 1727204185.17635: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204185.17640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204185.17648: variable 'omit' from source: magic vars 41016 1727204185.17921: variable 'ansible_distribution_major_version' from source: facts 41016 1727204185.17930: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204185.18040: variable 'type' from source: set_fact 41016 1727204185.18043: variable 'state' from source: include params 41016 1727204185.18047: Evaluated conditional (type == 'veth' and state == 'present'): True 41016 1727204185.18054: variable 'omit' from source: magic vars 41016 1727204185.18082: variable 'omit' from source: magic vars 41016 1727204185.18151: variable 'interface' from source: set_fact 41016 1727204185.18166: variable 'omit' from source: magic vars 41016 1727204185.18199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204185.18230: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204185.18245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204185.18258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204185.18270: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204185.18294: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204185.18297: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204185.18299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204185.18370: Set connection var ansible_shell_executable to /bin/sh 41016 1727204185.18374: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204185.18381: Set connection var ansible_shell_type to sh 41016 1727204185.18389: Set connection var ansible_timeout to 10 41016 1727204185.18392: Set connection var ansible_pipelining to False 41016 1727204185.18398: Set connection var ansible_connection to ssh 41016 1727204185.18417: variable 'ansible_shell_executable' from source: unknown 41016 1727204185.18421: variable 'ansible_connection' from source: unknown 41016 1727204185.18424: variable 'ansible_module_compression' from source: unknown 41016 1727204185.18426: variable 'ansible_shell_type' from source: unknown 41016 1727204185.18428: variable 'ansible_shell_executable' from source: unknown 41016 1727204185.18430: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204185.18435: variable 'ansible_pipelining' from source: unknown 41016 1727204185.18437: variable 'ansible_timeout' from source: unknown 41016 1727204185.18439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204185.18540: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204185.18554: variable 'omit' from source: magic vars 41016 1727204185.18558: starting attempt loop 41016 1727204185.18560: running the handler 41016 1727204185.18570: _low_level_execute_command(): starting 41016 1727204185.18578: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204185.19297: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204185.19387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204185.21171: stdout chunk (state=3): >>>/root <<< 41016 1727204185.21272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204185.21304: stderr chunk (state=3): >>><<< 41016 1727204185.21307: stdout chunk (state=3): >>><<< 41016 1727204185.21330: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204185.21343: _low_level_execute_command(): starting 41016 1727204185.21359: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204185.2133098-41959-115169817828988 `" && echo ansible-tmp-1727204185.2133098-41959-115169817828988="` echo /root/.ansible/tmp/ansible-tmp-1727204185.2133098-41959-115169817828988 `" ) && sleep 0' 41016 1727204185.21801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204185.21804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204185.21807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204185.21819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204185.21864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204185.21870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204185.21872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204185.21952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204185.24036: stdout chunk (state=3): >>>ansible-tmp-1727204185.2133098-41959-115169817828988=/root/.ansible/tmp/ansible-tmp-1727204185.2133098-41959-115169817828988 <<< 41016 1727204185.24140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204185.24170: stderr chunk (state=3): >>><<< 41016 1727204185.24173: stdout chunk (state=3): >>><<< 41016 1727204185.24196: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204185.2133098-41959-115169817828988=/root/.ansible/tmp/ansible-tmp-1727204185.2133098-41959-115169817828988 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 41016 1727204185.24225: variable 'ansible_module_compression' from source: unknown 41016 1727204185.24265: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204185.24296: variable 'ansible_facts' from source: unknown 41016 1727204185.24356: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204185.2133098-41959-115169817828988/AnsiballZ_command.py 41016 1727204185.24460: Sending initial data 41016 1727204185.24464: Sent initial data (156 bytes) 41016 1727204185.24928: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204185.24931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204185.24934: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204185.24936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204185.24982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204185.24994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204185.25088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204185.26818: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204185.26890: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204185.26965: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmp4_d3cq36 /root/.ansible/tmp/ansible-tmp-1727204185.2133098-41959-115169817828988/AnsiballZ_command.py <<< 41016 1727204185.26969: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204185.2133098-41959-115169817828988/AnsiballZ_command.py" <<< 41016 1727204185.27042: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmp4_d3cq36" to remote "/root/.ansible/tmp/ansible-tmp-1727204185.2133098-41959-115169817828988/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204185.2133098-41959-115169817828988/AnsiballZ_command.py" <<< 41016 1727204185.27709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204185.27754: stderr chunk (state=3): >>><<< 41016 1727204185.27759: stdout chunk (state=3): >>><<< 41016 1727204185.27803: done transferring module to remote 41016 1727204185.27815: _low_level_execute_command(): starting 41016 1727204185.27819: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204185.2133098-41959-115169817828988/ /root/.ansible/tmp/ansible-tmp-1727204185.2133098-41959-115169817828988/AnsiballZ_command.py && sleep 0' 41016 1727204185.28246: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204185.28254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204185.28280: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204185.28283: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204185.28286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 41016 1727204185.28291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204185.28351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204185.28354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204185.28434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204185.30393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204185.30420: stderr chunk (state=3): >>><<< 41016 1727204185.30423: stdout chunk (state=3): >>><<< 41016 1727204185.30436: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204185.30438: _low_level_execute_command(): starting 41016 1727204185.30443: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204185.2133098-41959-115169817828988/AnsiballZ_command.py && sleep 0' 41016 1727204185.30894: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204185.30897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204185.30900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204185.30902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204185.30904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204185.30956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204185.30962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204185.30964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204185.31051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204185.49726: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 14:56:25.475804", "end": "2024-09-24 14:56:25.493946", "delta": "0:00:00.018142", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204185.51677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204185.51682: stdout chunk (state=3): >>><<< 41016 1727204185.51684: stderr chunk (state=3): >>><<< 41016 1727204185.51687: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 14:56:25.475804", "end": "2024-09-24 14:56:25.493946", "delta": "0:00:00.018142", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
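The JSON payload above is the result of the "Set up veth as managed by NetworkManager" task named a few records further on: the module ran nmcli d set ethtest0 managed true with rc=0, where ethtest0 is the templated {{ interface }} fact. As a minimal sketch only, a task that drives this invocation plausibly looks like the YAML below. The task name and the nmcli command are taken from this log; changed_when: false is an assumption inferred from the final task result reporting "changed": false even though the module itself returned "changed": true. This is not the actual task definition from the test playbook.

    # Sketch reconstructed from the log, not copied from the test task file.
    # The task name and nmcli arguments match the module invocation recorded above;
    # changed_when is an assumption based on the final result showing "changed": false.
    - name: Set up veth as managed by NetworkManager
      command: nmcli d set {{ interface }} managed true
      changed_when: false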
41016 1727204185.51689: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204185.2133098-41959-115169817828988/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204185.51691: _low_level_execute_command(): starting 41016 1727204185.51693: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204185.2133098-41959-115169817828988/ > /dev/null 2>&1 && sleep 0' 41016 1727204185.52385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204185.52389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204185.52392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204185.52395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204185.52397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204185.52400: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204185.52402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204185.52410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204185.52412: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204185.52414: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41016 1727204185.52416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204185.52418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204185.52420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204185.52422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204185.52424: stderr chunk (state=3): >>>debug2: match found <<< 41016 1727204185.52425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204185.52495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204185.52511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204185.52620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204185.54599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204185.54707: stderr chunk (state=3): >>><<< 41016 1727204185.54710: stdout chunk (state=3): >>><<< 41016 1727204185.54713: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204185.54716: handler run complete 41016 1727204185.54723: Evaluated conditional (False): False 41016 1727204185.54740: attempt loop complete, returning result 41016 1727204185.54757: _execute() done 41016 1727204185.54764: dumping result to json 41016 1727204185.54774: done dumping result, returning 41016 1727204185.54788: done running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager [028d2410-947f-12d5-0ec4-00000000016f] 41016 1727204185.54797: sending task result for task 028d2410-947f-12d5-0ec4-00000000016f 41016 1727204185.54927: done sending task result for task 028d2410-947f-12d5-0ec4-00000000016f 41016 1727204185.54931: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.018142", "end": "2024-09-24 14:56:25.493946", "rc": 0, "start": "2024-09-24 14:56:25.475804" } 41016 1727204185.55162: no more pending results, returning what we have 41016 1727204185.55166: results queue empty 41016 1727204185.55168: checking for any_errors_fatal 41016 1727204185.55182: done checking for any_errors_fatal 41016 1727204185.55183: checking for max_fail_percentage 41016 1727204185.55185: done checking for max_fail_percentage 41016 1727204185.55186: checking to see if all hosts have failed and the running result is not ok 41016 1727204185.55187: done checking to see if all hosts have failed 41016 1727204185.55188: getting the remaining hosts for this loop 41016 1727204185.55190: done getting the remaining hosts for this loop 41016 1727204185.55194: getting the next task for host managed-node1 41016 1727204185.55200: done getting next task for host managed-node1 41016 1727204185.55203: ^ task is: TASK: Delete veth interface {{ interface }} 41016 1727204185.55207: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204185.55212: getting variables 41016 1727204185.55214: in VariableManager get_vars() 41016 1727204185.55254: Calling all_inventory to load vars for managed-node1 41016 1727204185.55257: Calling groups_inventory to load vars for managed-node1 41016 1727204185.55259: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204185.55268: Calling all_plugins_play to load vars for managed-node1 41016 1727204185.55271: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204185.55273: Calling groups_plugins_play to load vars for managed-node1 41016 1727204185.55751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204185.55969: done with get_vars() 41016 1727204185.55983: done getting variables 41016 1727204185.56048: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204185.56170: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.391) 0:00:09.238 ***** 41016 1727204185.56208: entering _queue_task() for managed-node1/command 41016 1727204185.56610: worker is 1 (out of 1 available) 41016 1727204185.56620: exiting _queue_task() for managed-node1/command 41016 1727204185.56631: done queuing things up, now waiting for results queue to drain 41016 1727204185.56632: waiting for pending results... 
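The task queued here, defined at manage_test_interface.yml:43, never executes: the records that follow evaluate its when guard to False and skip it. As a rough sketch of such a guarded task, the when expression below is quoted verbatim from the false_condition in the skip result, while the ip link command is a hypothetical placeholder, since a skipped task's body is not written to the log.

    # Sketch only. The when expression is taken from the log's false_condition;
    # the command is a hypothetical stand-in (a skipped task's real body is not logged).
    - name: Delete veth interface {{ interface }}
      command: ip link delete {{ interface }}   # hypothetical placeholder
      when: type == 'veth' and state == 'absent' and interface in current_interfaces

In this run the guard evaluates to False, so the task is skipped, which is exactly the skipping: [managed-node1] result shown next.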
41016 1727204185.56920: running TaskExecutor() for managed-node1/TASK: Delete veth interface ethtest0 41016 1727204185.56969: in run() - task 028d2410-947f-12d5-0ec4-000000000170 41016 1727204185.56973: variable 'ansible_search_path' from source: unknown 41016 1727204185.56979: variable 'ansible_search_path' from source: unknown 41016 1727204185.57008: calling self._execute() 41016 1727204185.57111: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204185.57128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204185.57144: variable 'omit' from source: magic vars 41016 1727204185.57562: variable 'ansible_distribution_major_version' from source: facts 41016 1727204185.57566: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204185.57767: variable 'type' from source: set_fact 41016 1727204185.57783: variable 'state' from source: include params 41016 1727204185.57791: variable 'interface' from source: set_fact 41016 1727204185.57837: variable 'current_interfaces' from source: set_fact 41016 1727204185.57840: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 41016 1727204185.57842: when evaluation is False, skipping this task 41016 1727204185.57844: _execute() done 41016 1727204185.57846: dumping result to json 41016 1727204185.57848: done dumping result, returning 41016 1727204185.57849: done running TaskExecutor() for managed-node1/TASK: Delete veth interface ethtest0 [028d2410-947f-12d5-0ec4-000000000170] 41016 1727204185.57851: sending task result for task 028d2410-947f-12d5-0ec4-000000000170 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 41016 1727204185.57987: no more pending results, returning what we have 41016 1727204185.57993: results queue empty 41016 1727204185.57994: checking for any_errors_fatal 41016 1727204185.58003: done checking for any_errors_fatal 41016 1727204185.58004: checking for max_fail_percentage 41016 1727204185.58006: done checking for max_fail_percentage 41016 1727204185.58008: checking to see if all hosts have failed and the running result is not ok 41016 1727204185.58008: done checking to see if all hosts have failed 41016 1727204185.58009: getting the remaining hosts for this loop 41016 1727204185.58010: done getting the remaining hosts for this loop 41016 1727204185.58014: getting the next task for host managed-node1 41016 1727204185.58022: done getting next task for host managed-node1 41016 1727204185.58025: ^ task is: TASK: Create dummy interface {{ interface }} 41016 1727204185.58028: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204185.58033: getting variables 41016 1727204185.58034: in VariableManager get_vars() 41016 1727204185.58078: Calling all_inventory to load vars for managed-node1 41016 1727204185.58082: Calling groups_inventory to load vars for managed-node1 41016 1727204185.58084: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204185.58096: Calling all_plugins_play to load vars for managed-node1 41016 1727204185.58099: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204185.58101: Calling groups_plugins_play to load vars for managed-node1 41016 1727204185.58464: done sending task result for task 028d2410-947f-12d5-0ec4-000000000170 41016 1727204185.58468: WORKER PROCESS EXITING 41016 1727204185.58493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204185.58812: done with get_vars() 41016 1727204185.58821: done getting variables 41016 1727204185.58883: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204185.58996: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.028) 0:00:09.266 ***** 41016 1727204185.59023: entering _queue_task() for managed-node1/command 41016 1727204185.59382: worker is 1 (out of 1 available) 41016 1727204185.59392: exiting _queue_task() for managed-node1/command 41016 1727204185.59403: done queuing things up, now waiting for results queue to drain 41016 1727204185.59404: waiting for pending results... 
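The dummy-interface create task queued here (manage_test_interface.yml:49) and the matching delete at :54 that follows use the same guard pattern and are likewise skipped; only the type/state combination in the condition changes, and the tap tasks at :60 and :65 repeat it once more. A compact sketch, with the conditions quoted from the corresponding false_condition records and the ip link bodies again hypothetical placeholders:

    # Sketch only; conditions quoted from the log, command bodies hypothetical.
    - name: Create dummy interface {{ interface }}
      command: ip link add {{ interface }} type dummy
      when: type == 'dummy' and state == 'present' and interface not in current_interfaces

    - name: Delete dummy interface {{ interface }}
      command: ip link delete {{ interface }}
      when: type == 'dummy' and state == 'absent' and interface in current_interfaces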
41016 1727204185.59549: running TaskExecutor() for managed-node1/TASK: Create dummy interface ethtest0 41016 1727204185.59682: in run() - task 028d2410-947f-12d5-0ec4-000000000171 41016 1727204185.59686: variable 'ansible_search_path' from source: unknown 41016 1727204185.59689: variable 'ansible_search_path' from source: unknown 41016 1727204185.59819: calling self._execute() 41016 1727204185.59823: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204185.59832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204185.59850: variable 'omit' from source: magic vars 41016 1727204185.60221: variable 'ansible_distribution_major_version' from source: facts 41016 1727204185.60238: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204185.60469: variable 'type' from source: set_fact 41016 1727204185.60482: variable 'state' from source: include params 41016 1727204185.60493: variable 'interface' from source: set_fact 41016 1727204185.60504: variable 'current_interfaces' from source: set_fact 41016 1727204185.60517: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 41016 1727204185.60524: when evaluation is False, skipping this task 41016 1727204185.60531: _execute() done 41016 1727204185.60539: dumping result to json 41016 1727204185.60583: done dumping result, returning 41016 1727204185.60586: done running TaskExecutor() for managed-node1/TASK: Create dummy interface ethtest0 [028d2410-947f-12d5-0ec4-000000000171] 41016 1727204185.60589: sending task result for task 028d2410-947f-12d5-0ec4-000000000171 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 41016 1727204185.60734: no more pending results, returning what we have 41016 1727204185.60739: results queue empty 41016 1727204185.60740: checking for any_errors_fatal 41016 1727204185.60745: done checking for any_errors_fatal 41016 1727204185.60746: checking for max_fail_percentage 41016 1727204185.60747: done checking for max_fail_percentage 41016 1727204185.60748: checking to see if all hosts have failed and the running result is not ok 41016 1727204185.60749: done checking to see if all hosts have failed 41016 1727204185.60750: getting the remaining hosts for this loop 41016 1727204185.60751: done getting the remaining hosts for this loop 41016 1727204185.60755: getting the next task for host managed-node1 41016 1727204185.60761: done getting next task for host managed-node1 41016 1727204185.60765: ^ task is: TASK: Delete dummy interface {{ interface }} 41016 1727204185.60768: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204185.60771: getting variables 41016 1727204185.60773: in VariableManager get_vars() 41016 1727204185.60816: Calling all_inventory to load vars for managed-node1 41016 1727204185.60819: Calling groups_inventory to load vars for managed-node1 41016 1727204185.60822: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204185.60834: Calling all_plugins_play to load vars for managed-node1 41016 1727204185.60837: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204185.60840: Calling groups_plugins_play to load vars for managed-node1 41016 1727204185.61204: done sending task result for task 028d2410-947f-12d5-0ec4-000000000171 41016 1727204185.61207: WORKER PROCESS EXITING 41016 1727204185.61229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204185.61408: done with get_vars() 41016 1727204185.61423: done getting variables 41016 1727204185.61481: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204185.61605: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.026) 0:00:09.292 ***** 41016 1727204185.61641: entering _queue_task() for managed-node1/command 41016 1727204185.61926: worker is 1 (out of 1 available) 41016 1727204185.61939: exiting _queue_task() for managed-node1/command 41016 1727204185.61951: done queuing things up, now waiting for results queue to drain 41016 1727204185.61952: waiting for pending results... 
41016 1727204185.62228: running TaskExecutor() for managed-node1/TASK: Delete dummy interface ethtest0 41016 1727204185.62337: in run() - task 028d2410-947f-12d5-0ec4-000000000172 41016 1727204185.62360: variable 'ansible_search_path' from source: unknown 41016 1727204185.62368: variable 'ansible_search_path' from source: unknown 41016 1727204185.62420: calling self._execute() 41016 1727204185.62512: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204185.62527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204185.62541: variable 'omit' from source: magic vars 41016 1727204185.62915: variable 'ansible_distribution_major_version' from source: facts 41016 1727204185.62929: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204185.63117: variable 'type' from source: set_fact 41016 1727204185.63126: variable 'state' from source: include params 41016 1727204185.63169: variable 'interface' from source: set_fact 41016 1727204185.63172: variable 'current_interfaces' from source: set_fact 41016 1727204185.63174: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 41016 1727204185.63178: when evaluation is False, skipping this task 41016 1727204185.63180: _execute() done 41016 1727204185.63182: dumping result to json 41016 1727204185.63183: done dumping result, returning 41016 1727204185.63189: done running TaskExecutor() for managed-node1/TASK: Delete dummy interface ethtest0 [028d2410-947f-12d5-0ec4-000000000172] 41016 1727204185.63197: sending task result for task 028d2410-947f-12d5-0ec4-000000000172 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 41016 1727204185.63438: no more pending results, returning what we have 41016 1727204185.63443: results queue empty 41016 1727204185.63444: checking for any_errors_fatal 41016 1727204185.63450: done checking for any_errors_fatal 41016 1727204185.63451: checking for max_fail_percentage 41016 1727204185.63453: done checking for max_fail_percentage 41016 1727204185.63454: checking to see if all hosts have failed and the running result is not ok 41016 1727204185.63455: done checking to see if all hosts have failed 41016 1727204185.63456: getting the remaining hosts for this loop 41016 1727204185.63457: done getting the remaining hosts for this loop 41016 1727204185.63460: getting the next task for host managed-node1 41016 1727204185.63468: done getting next task for host managed-node1 41016 1727204185.63470: ^ task is: TASK: Create tap interface {{ interface }} 41016 1727204185.63474: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204185.63480: getting variables 41016 1727204185.63482: in VariableManager get_vars() 41016 1727204185.63530: Calling all_inventory to load vars for managed-node1 41016 1727204185.63534: Calling groups_inventory to load vars for managed-node1 41016 1727204185.63537: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204185.63549: Calling all_plugins_play to load vars for managed-node1 41016 1727204185.63552: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204185.63554: Calling groups_plugins_play to load vars for managed-node1 41016 1727204185.63877: done sending task result for task 028d2410-947f-12d5-0ec4-000000000172 41016 1727204185.63881: WORKER PROCESS EXITING 41016 1727204185.63910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204185.64115: done with get_vars() 41016 1727204185.64131: done getting variables 41016 1727204185.64185: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204185.64292: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.026) 0:00:09.319 ***** 41016 1727204185.64320: entering _queue_task() for managed-node1/command 41016 1727204185.64550: worker is 1 (out of 1 available) 41016 1727204185.64681: exiting _queue_task() for managed-node1/command 41016 1727204185.64692: done queuing things up, now waiting for results queue to drain 41016 1727204185.64694: waiting for pending results... 
41016 1727204185.64845: running TaskExecutor() for managed-node1/TASK: Create tap interface ethtest0 41016 1727204185.64962: in run() - task 028d2410-947f-12d5-0ec4-000000000173 41016 1727204185.64985: variable 'ansible_search_path' from source: unknown 41016 1727204185.64993: variable 'ansible_search_path' from source: unknown 41016 1727204185.65045: calling self._execute() 41016 1727204185.65140: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204185.65152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204185.65166: variable 'omit' from source: magic vars 41016 1727204185.65556: variable 'ansible_distribution_major_version' from source: facts 41016 1727204185.65579: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204185.65910: variable 'type' from source: set_fact 41016 1727204185.65913: variable 'state' from source: include params 41016 1727204185.65915: variable 'interface' from source: set_fact 41016 1727204185.65918: variable 'current_interfaces' from source: set_fact 41016 1727204185.65921: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 41016 1727204185.65923: when evaluation is False, skipping this task 41016 1727204185.65925: _execute() done 41016 1727204185.65927: dumping result to json 41016 1727204185.65929: done dumping result, returning 41016 1727204185.65931: done running TaskExecutor() for managed-node1/TASK: Create tap interface ethtest0 [028d2410-947f-12d5-0ec4-000000000173] 41016 1727204185.65934: sending task result for task 028d2410-947f-12d5-0ec4-000000000173 41016 1727204185.66111: done sending task result for task 028d2410-947f-12d5-0ec4-000000000173 41016 1727204185.66114: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 41016 1727204185.66163: no more pending results, returning what we have 41016 1727204185.66167: results queue empty 41016 1727204185.66169: checking for any_errors_fatal 41016 1727204185.66179: done checking for any_errors_fatal 41016 1727204185.66180: checking for max_fail_percentage 41016 1727204185.66182: done checking for max_fail_percentage 41016 1727204185.66183: checking to see if all hosts have failed and the running result is not ok 41016 1727204185.66184: done checking to see if all hosts have failed 41016 1727204185.66185: getting the remaining hosts for this loop 41016 1727204185.66186: done getting the remaining hosts for this loop 41016 1727204185.66190: getting the next task for host managed-node1 41016 1727204185.66234: done getting next task for host managed-node1 41016 1727204185.66238: ^ task is: TASK: Delete tap interface {{ interface }} 41016 1727204185.66241: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204185.66245: getting variables 41016 1727204185.66248: in VariableManager get_vars() 41016 1727204185.66292: Calling all_inventory to load vars for managed-node1 41016 1727204185.66294: Calling groups_inventory to load vars for managed-node1 41016 1727204185.66297: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204185.66311: Calling all_plugins_play to load vars for managed-node1 41016 1727204185.66314: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204185.66317: Calling groups_plugins_play to load vars for managed-node1 41016 1727204185.66693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204185.66903: done with get_vars() 41016 1727204185.66913: done getting variables 41016 1727204185.66980: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204185.67097: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.028) 0:00:09.347 ***** 41016 1727204185.67127: entering _queue_task() for managed-node1/command 41016 1727204185.67593: worker is 1 (out of 1 available) 41016 1727204185.67604: exiting _queue_task() for managed-node1/command 41016 1727204185.67618: done queuing things up, now waiting for results queue to drain 41016 1727204185.67620: waiting for pending results... 
41016 1727204185.67851: running TaskExecutor() for managed-node1/TASK: Delete tap interface ethtest0 41016 1727204185.67982: in run() - task 028d2410-947f-12d5-0ec4-000000000174 41016 1727204185.67986: variable 'ansible_search_path' from source: unknown 41016 1727204185.67989: variable 'ansible_search_path' from source: unknown 41016 1727204185.67992: calling self._execute() 41016 1727204185.68084: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204185.68091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204185.68111: variable 'omit' from source: magic vars 41016 1727204185.68482: variable 'ansible_distribution_major_version' from source: facts 41016 1727204185.68492: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204185.68632: variable 'type' from source: set_fact 41016 1727204185.68636: variable 'state' from source: include params 41016 1727204185.68639: variable 'interface' from source: set_fact 41016 1727204185.68644: variable 'current_interfaces' from source: set_fact 41016 1727204185.68651: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 41016 1727204185.68654: when evaluation is False, skipping this task 41016 1727204185.68656: _execute() done 41016 1727204185.68658: dumping result to json 41016 1727204185.68663: done dumping result, returning 41016 1727204185.68668: done running TaskExecutor() for managed-node1/TASK: Delete tap interface ethtest0 [028d2410-947f-12d5-0ec4-000000000174] 41016 1727204185.68673: sending task result for task 028d2410-947f-12d5-0ec4-000000000174 41016 1727204185.68750: done sending task result for task 028d2410-947f-12d5-0ec4-000000000174 41016 1727204185.68753: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 41016 1727204185.68802: no more pending results, returning what we have 41016 1727204185.68806: results queue empty 41016 1727204185.68807: checking for any_errors_fatal 41016 1727204185.68814: done checking for any_errors_fatal 41016 1727204185.68815: checking for max_fail_percentage 41016 1727204185.68817: done checking for max_fail_percentage 41016 1727204185.68818: checking to see if all hosts have failed and the running result is not ok 41016 1727204185.68819: done checking to see if all hosts have failed 41016 1727204185.68819: getting the remaining hosts for this loop 41016 1727204185.68820: done getting the remaining hosts for this loop 41016 1727204185.68824: getting the next task for host managed-node1 41016 1727204185.68831: done getting next task for host managed-node1 41016 1727204185.68834: ^ task is: TASK: Assert device is present 41016 1727204185.68836: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204185.68839: getting variables 41016 1727204185.68840: in VariableManager get_vars() 41016 1727204185.68874: Calling all_inventory to load vars for managed-node1 41016 1727204185.68878: Calling groups_inventory to load vars for managed-node1 41016 1727204185.68880: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204185.68890: Calling all_plugins_play to load vars for managed-node1 41016 1727204185.68892: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204185.68894: Calling groups_plugins_play to load vars for managed-node1 41016 1727204185.69052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204185.69168: done with get_vars() 41016 1727204185.69177: done getting variables TASK [Assert device is present] ************************************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:21 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.021) 0:00:09.368 ***** 41016 1727204185.69237: entering _queue_task() for managed-node1/include_tasks 41016 1727204185.69423: worker is 1 (out of 1 available) 41016 1727204185.69433: exiting _queue_task() for managed-node1/include_tasks 41016 1727204185.69445: done queuing things up, now waiting for results queue to drain 41016 1727204185.69446: waiting for pending results... 41016 1727204185.69597: running TaskExecutor() for managed-node1/TASK: Assert device is present 41016 1727204185.69654: in run() - task 028d2410-947f-12d5-0ec4-00000000000e 41016 1727204185.69666: variable 'ansible_search_path' from source: unknown 41016 1727204185.69695: calling self._execute() 41016 1727204185.69840: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204185.69844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204185.69846: variable 'omit' from source: magic vars 41016 1727204185.70472: variable 'ansible_distribution_major_version' from source: facts 41016 1727204185.70474: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204185.70479: _execute() done 41016 1727204185.70481: dumping result to json 41016 1727204185.70483: done dumping result, returning 41016 1727204185.70486: done running TaskExecutor() for managed-node1/TASK: Assert device is present [028d2410-947f-12d5-0ec4-00000000000e] 41016 1727204185.70488: sending task result for task 028d2410-947f-12d5-0ec4-00000000000e 41016 1727204185.70546: done sending task result for task 028d2410-947f-12d5-0ec4-00000000000e 41016 1727204185.70548: WORKER PROCESS EXITING 41016 1727204185.70569: no more pending results, returning what we have 41016 1727204185.70577: in VariableManager get_vars() 41016 1727204185.70611: Calling all_inventory to load vars for managed-node1 41016 1727204185.70614: Calling groups_inventory to load vars for managed-node1 41016 1727204185.70615: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204185.70625: Calling all_plugins_play to load vars for managed-node1 41016 1727204185.70627: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204185.70630: Calling groups_plugins_play to load vars for managed-node1 41016 1727204185.70873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204185.71121: done with get_vars() 41016 1727204185.71127: variable 
'ansible_search_path' from source: unknown 41016 1727204185.71139: we have included files to process 41016 1727204185.71140: generating all_blocks data 41016 1727204185.71142: done generating all_blocks data 41016 1727204185.71146: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 41016 1727204185.71147: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 41016 1727204185.71150: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 41016 1727204185.71307: in VariableManager get_vars() 41016 1727204185.71337: done with get_vars() 41016 1727204185.71452: done processing included file 41016 1727204185.71455: iterating over new_blocks loaded from include file 41016 1727204185.71457: in VariableManager get_vars() 41016 1727204185.71473: done with get_vars() 41016 1727204185.71478: filtering new block on tags 41016 1727204185.71495: done filtering new block on tags 41016 1727204185.71497: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node1 41016 1727204185.71503: extending task lists for all hosts with included blocks 41016 1727204185.72533: done extending task lists 41016 1727204185.72535: done processing included files 41016 1727204185.72536: results queue empty 41016 1727204185.72536: checking for any_errors_fatal 41016 1727204185.72538: done checking for any_errors_fatal 41016 1727204185.72539: checking for max_fail_percentage 41016 1727204185.72540: done checking for max_fail_percentage 41016 1727204185.72541: checking to see if all hosts have failed and the running result is not ok 41016 1727204185.72542: done checking to see if all hosts have failed 41016 1727204185.72543: getting the remaining hosts for this loop 41016 1727204185.72544: done getting the remaining hosts for this loop 41016 1727204185.72546: getting the next task for host managed-node1 41016 1727204185.72550: done getting next task for host managed-node1 41016 1727204185.72552: ^ task is: TASK: Include the task 'get_interface_stat.yml' 41016 1727204185.72554: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204185.72556: getting variables 41016 1727204185.72557: in VariableManager get_vars() 41016 1727204185.72570: Calling all_inventory to load vars for managed-node1 41016 1727204185.72572: Calling groups_inventory to load vars for managed-node1 41016 1727204185.72574: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204185.72582: Calling all_plugins_play to load vars for managed-node1 41016 1727204185.72584: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204185.72587: Calling groups_plugins_play to load vars for managed-node1 41016 1727204185.72750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204185.72958: done with get_vars() 41016 1727204185.72967: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.037) 0:00:09.406 ***** 41016 1727204185.73039: entering _queue_task() for managed-node1/include_tasks 41016 1727204185.73510: worker is 1 (out of 1 available) 41016 1727204185.73519: exiting _queue_task() for managed-node1/include_tasks 41016 1727204185.73528: done queuing things up, now waiting for results queue to drain 41016 1727204185.73529: waiting for pending results... 41016 1727204185.73770: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 41016 1727204185.73774: in run() - task 028d2410-947f-12d5-0ec4-000000000214 41016 1727204185.73778: variable 'ansible_search_path' from source: unknown 41016 1727204185.73781: variable 'ansible_search_path' from source: unknown 41016 1727204185.73784: calling self._execute() 41016 1727204185.73868: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204185.73883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204185.73896: variable 'omit' from source: magic vars 41016 1727204185.74283: variable 'ansible_distribution_major_version' from source: facts 41016 1727204185.74311: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204185.74321: _execute() done 41016 1727204185.74328: dumping result to json 41016 1727204185.74335: done dumping result, returning 41016 1727204185.74344: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-12d5-0ec4-000000000214] 41016 1727204185.74353: sending task result for task 028d2410-947f-12d5-0ec4-000000000214 41016 1727204185.74627: done sending task result for task 028d2410-947f-12d5-0ec4-000000000214 41016 1727204185.74652: no more pending results, returning what we have 41016 1727204185.74656: in VariableManager get_vars() 41016 1727204185.74697: Calling all_inventory to load vars for managed-node1 41016 1727204185.74700: Calling groups_inventory to load vars for managed-node1 41016 1727204185.74703: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204185.74715: Calling all_plugins_play to load vars for managed-node1 41016 1727204185.74718: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204185.74720: Calling groups_plugins_play to load vars for managed-node1 41016 1727204185.75000: WORKER PROCESS EXITING 41016 1727204185.75026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 41016 1727204185.75282: done with get_vars() 41016 1727204185.75290: variable 'ansible_search_path' from source: unknown 41016 1727204185.75291: variable 'ansible_search_path' from source: unknown 41016 1727204185.75393: we have included files to process 41016 1727204185.75394: generating all_blocks data 41016 1727204185.75395: done generating all_blocks data 41016 1727204185.75397: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41016 1727204185.75398: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41016 1727204185.75400: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41016 1727204185.76043: done processing included file 41016 1727204185.76045: iterating over new_blocks loaded from include file 41016 1727204185.76047: in VariableManager get_vars() 41016 1727204185.76064: done with get_vars() 41016 1727204185.76066: filtering new block on tags 41016 1727204185.76185: done filtering new block on tags 41016 1727204185.76196: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 41016 1727204185.76201: extending task lists for all hosts with included blocks 41016 1727204185.76419: done extending task lists 41016 1727204185.76421: done processing included files 41016 1727204185.76422: results queue empty 41016 1727204185.76422: checking for any_errors_fatal 41016 1727204185.76425: done checking for any_errors_fatal 41016 1727204185.76426: checking for max_fail_percentage 41016 1727204185.76427: done checking for max_fail_percentage 41016 1727204185.76427: checking to see if all hosts have failed and the running result is not ok 41016 1727204185.76428: done checking to see if all hosts have failed 41016 1727204185.76429: getting the remaining hosts for this loop 41016 1727204185.76430: done getting the remaining hosts for this loop 41016 1727204185.76432: getting the next task for host managed-node1 41016 1727204185.76437: done getting next task for host managed-node1 41016 1727204185.76439: ^ task is: TASK: Get stat for interface {{ interface }} 41016 1727204185.76442: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204185.76444: getting variables 41016 1727204185.76445: in VariableManager get_vars() 41016 1727204185.76457: Calling all_inventory to load vars for managed-node1 41016 1727204185.76460: Calling groups_inventory to load vars for managed-node1 41016 1727204185.76461: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204185.76466: Calling all_plugins_play to load vars for managed-node1 41016 1727204185.76469: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204185.76471: Calling groups_plugins_play to load vars for managed-node1 41016 1727204185.76836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204185.77315: done with get_vars() 41016 1727204185.77439: done getting variables 41016 1727204185.77695: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.046) 0:00:09.453 ***** 41016 1727204185.77724: entering _queue_task() for managed-node1/stat 41016 1727204185.78353: worker is 1 (out of 1 available) 41016 1727204185.78366: exiting _queue_task() for managed-node1/stat 41016 1727204185.78381: done queuing things up, now waiting for results queue to drain 41016 1727204185.78382: waiting for pending results... 41016 1727204185.78774: running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 41016 1727204185.79059: in run() - task 028d2410-947f-12d5-0ec4-000000000267 41016 1727204185.79063: variable 'ansible_search_path' from source: unknown 41016 1727204185.79066: variable 'ansible_search_path' from source: unknown 41016 1727204185.79102: calling self._execute() 41016 1727204185.79292: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204185.79295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204185.79306: variable 'omit' from source: magic vars 41016 1727204185.80088: variable 'ansible_distribution_major_version' from source: facts 41016 1727204185.80151: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204185.80155: variable 'omit' from source: magic vars 41016 1727204185.80158: variable 'omit' from source: magic vars 41016 1727204185.80517: variable 'interface' from source: set_fact 41016 1727204185.80539: variable 'omit' from source: magic vars 41016 1727204185.80586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204185.80615: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204185.80804: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204185.80808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204185.80813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204185.80816: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204185.80819: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204185.80821: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 41016 1727204185.81112: Set connection var ansible_shell_executable to /bin/sh 41016 1727204185.81115: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204185.81132: Set connection var ansible_shell_type to sh 41016 1727204185.81135: Set connection var ansible_timeout to 10 41016 1727204185.81138: Set connection var ansible_pipelining to False 41016 1727204185.81140: Set connection var ansible_connection to ssh 41016 1727204185.81242: variable 'ansible_shell_executable' from source: unknown 41016 1727204185.81246: variable 'ansible_connection' from source: unknown 41016 1727204185.81249: variable 'ansible_module_compression' from source: unknown 41016 1727204185.81251: variable 'ansible_shell_type' from source: unknown 41016 1727204185.81253: variable 'ansible_shell_executable' from source: unknown 41016 1727204185.81256: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204185.81258: variable 'ansible_pipelining' from source: unknown 41016 1727204185.81260: variable 'ansible_timeout' from source: unknown 41016 1727204185.81285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204185.81680: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204185.81691: variable 'omit' from source: magic vars 41016 1727204185.81697: starting attempt loop 41016 1727204185.81700: running the handler 41016 1727204185.81716: _low_level_execute_command(): starting 41016 1727204185.81838: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204185.83382: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 41016 1727204185.83386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204185.83426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204185.83430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204185.83462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204185.83615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204185.85387: stdout chunk (state=3): >>>/root <<< 41016 1727204185.85557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204185.85560: stdout chunk (state=3): >>><<< 41016 1727204185.85563: stderr chunk (state=3): >>><<< 41016 1727204185.85686: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204185.85690: _low_level_execute_command(): starting 41016 1727204185.85693: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204185.855864-41984-269067354661913 `" && echo ansible-tmp-1727204185.855864-41984-269067354661913="` echo /root/.ansible/tmp/ansible-tmp-1727204185.855864-41984-269067354661913 `" ) && sleep 0' 41016 1727204185.86290: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204185.86342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204185.86448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204185.86461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204185.86700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204185.88902: stdout chunk (state=3): >>>ansible-tmp-1727204185.855864-41984-269067354661913=/root/.ansible/tmp/ansible-tmp-1727204185.855864-41984-269067354661913 <<< 41016 1727204185.88937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204185.88940: stdout chunk (state=3): >>><<< 41016 1727204185.88943: stderr chunk (state=3): >>><<< 41016 1727204185.88964: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204185.855864-41984-269067354661913=/root/.ansible/tmp/ansible-tmp-1727204185.855864-41984-269067354661913 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204185.89087: variable 'ansible_module_compression' from source: unknown 41016 1727204185.89198: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 41016 1727204185.89325: variable 'ansible_facts' from source: unknown 41016 1727204185.89483: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204185.855864-41984-269067354661913/AnsiballZ_stat.py 41016 1727204185.90041: Sending initial data 41016 1727204185.90044: Sent initial data (152 bytes) 41016 1727204185.91194: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204185.91396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204185.91446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204185.91618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204185.93382: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 41016 1727204185.93422: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server 
supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204185.93497: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204185.93610: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpfyhdpgql /root/.ansible/tmp/ansible-tmp-1727204185.855864-41984-269067354661913/AnsiballZ_stat.py <<< 41016 1727204185.93614: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204185.855864-41984-269067354661913/AnsiballZ_stat.py" <<< 41016 1727204185.93702: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpfyhdpgql" to remote "/root/.ansible/tmp/ansible-tmp-1727204185.855864-41984-269067354661913/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204185.855864-41984-269067354661913/AnsiballZ_stat.py" <<< 41016 1727204185.94896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204185.94937: stderr chunk (state=3): >>><<< 41016 1727204185.94943: stdout chunk (state=3): >>><<< 41016 1727204185.94961: done transferring module to remote 41016 1727204185.94970: _low_level_execute_command(): starting 41016 1727204185.94974: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204185.855864-41984-269067354661913/ /root/.ansible/tmp/ansible-tmp-1727204185.855864-41984-269067354661913/AnsiballZ_stat.py && sleep 0' 41016 1727204185.95591: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204185.95637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204185.95652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204185.95671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204185.95785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204185.97842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204185.97846: stdout chunk (state=3): 
>>><<< 41016 1727204185.97848: stderr chunk (state=3): >>><<< 41016 1727204185.97882: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204185.97890: _low_level_execute_command(): starting 41016 1727204185.97893: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204185.855864-41984-269067354661913/AnsiballZ_stat.py && sleep 0' 41016 1727204185.98724: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204185.98736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204185.98751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204185.98780: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204185.98927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204186.15458: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30208, "dev": 23, "nlink": 1, "atime": 1727204184.2187715, "mtime": 1727204184.2187715, "ctime": 1727204184.2187715, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 
0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 41016 1727204186.16964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204186.16990: stderr chunk (state=3): >>><<< 41016 1727204186.16993: stdout chunk (state=3): >>><<< 41016 1727204186.17009: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30208, "dev": 23, "nlink": 1, "atime": 1727204184.2187715, "mtime": 1727204184.2187715, "ctime": 1727204184.2187715, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
41016 1727204186.17052: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204185.855864-41984-269067354661913/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204186.17059: _low_level_execute_command(): starting 41016 1727204186.17064: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204185.855864-41984-269067354661913/ > /dev/null 2>&1 && sleep 0' 41016 1727204186.17513: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204186.17517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204186.17522: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 41016 1727204186.17524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204186.17526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204186.17570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204186.17573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204186.17579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204186.17654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204186.19621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204186.19644: stderr chunk (state=3): >>><<< 41016 1727204186.19647: stdout chunk (state=3): >>><<< 41016 1727204186.19660: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204186.19669: handler run complete 41016 1727204186.19702: attempt loop complete, returning result 41016 1727204186.19705: _execute() done 41016 1727204186.19707: dumping result to json 41016 1727204186.19712: done dumping result, returning 41016 1727204186.19721: done running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 [028d2410-947f-12d5-0ec4-000000000267] 41016 1727204186.19723: sending task result for task 028d2410-947f-12d5-0ec4-000000000267 41016 1727204186.19822: done sending task result for task 028d2410-947f-12d5-0ec4-000000000267 41016 1727204186.19824: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727204184.2187715, "block_size": 4096, "blocks": 0, "ctime": 1727204184.2187715, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30208, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1727204184.2187715, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 41016 1727204186.19913: no more pending results, returning what we have 41016 1727204186.19916: results queue empty 41016 1727204186.19917: checking for any_errors_fatal 41016 1727204186.19918: done checking for any_errors_fatal 41016 1727204186.19919: checking for max_fail_percentage 41016 1727204186.19920: done checking for max_fail_percentage 41016 1727204186.19921: checking to see if all hosts have failed and the running result is not ok 41016 1727204186.19922: done checking to see if all hosts have failed 41016 1727204186.19923: getting the remaining hosts for this loop 41016 1727204186.19924: done getting the remaining hosts for this loop 41016 1727204186.19928: getting the next task for host managed-node1 41016 1727204186.19937: done getting next task for host managed-node1 41016 1727204186.19940: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 41016 1727204186.19942: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204186.19946: getting variables 41016 1727204186.19948: in VariableManager get_vars() 41016 1727204186.19986: Calling all_inventory to load vars for managed-node1 41016 1727204186.19988: Calling groups_inventory to load vars for managed-node1 41016 1727204186.19990: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204186.20000: Calling all_plugins_play to load vars for managed-node1 41016 1727204186.20002: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204186.20005: Calling groups_plugins_play to load vars for managed-node1 41016 1727204186.20139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204186.20261: done with get_vars() 41016 1727204186.20269: done getting variables 41016 1727204186.20343: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 41016 1727204186.20435: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.427) 0:00:09.880 ***** 41016 1727204186.20456: entering _queue_task() for managed-node1/assert 41016 1727204186.20458: Creating lock for assert 41016 1727204186.20668: worker is 1 (out of 1 available) 41016 1727204186.20682: exiting _queue_task() for managed-node1/assert 41016 1727204186.20694: done queuing things up, now waiting for results queue to drain 41016 1727204186.20695: waiting for pending results... 
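Before the assert task runs, it is worth summarizing the remote-execution lifecycle that the stat task above just walked through: resolve the remote home, create a per-run temporary directory, transfer AnsiballZ_stat.py over the multiplexed SSH channel via sftp, mark it executable, run it with the remote Python, and remove the directory. The sketch below condenses those logged /bin/sh commands; the temporary directory name is a placeholder (the real one is generated per task), and the real mkdir command also echoes the resolved path back to the controller.

    # Condensed from the _low_level_execute_command() calls logged above.
    tmp = "/root/.ansible/tmp/ansible-tmp-<timestamp>-<pid>-<random>"  # placeholder
    lifecycle = [
        "echo ~ && sleep 0",                                        # resolve remote home
        f'( umask 77 && mkdir -p "{tmp}" ) && sleep 0',             # create the tmpdir
        "# (sftp) put AnsiballZ_stat.py into the tmpdir",           # module transfer
        f"chmod u+x {tmp}/ {tmp}/AnsiballZ_stat.py && sleep 0",     # mark executable
        f"/usr/bin/python3.12 {tmp}/AnsiballZ_stat.py && sleep 0",  # run the module
        f"rm -f -r {tmp}/ > /dev/null 2>&1 && sleep 0",             # clean up
    ]
    for step in lifecycle:
        print("/bin/sh -c", repr(step))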
41016 1727204186.20858: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'ethtest0' 41016 1727204186.20924: in run() - task 028d2410-947f-12d5-0ec4-000000000215 41016 1727204186.20942: variable 'ansible_search_path' from source: unknown 41016 1727204186.20946: variable 'ansible_search_path' from source: unknown 41016 1727204186.20969: calling self._execute() 41016 1727204186.21052: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.21056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.21085: variable 'omit' from source: magic vars 41016 1727204186.21581: variable 'ansible_distribution_major_version' from source: facts 41016 1727204186.21584: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204186.21586: variable 'omit' from source: magic vars 41016 1727204186.21589: variable 'omit' from source: magic vars 41016 1727204186.21663: variable 'interface' from source: set_fact 41016 1727204186.21691: variable 'omit' from source: magic vars 41016 1727204186.21737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204186.21782: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204186.21799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204186.21833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204186.21837: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204186.21870: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204186.21873: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.21877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.22187: Set connection var ansible_shell_executable to /bin/sh 41016 1727204186.22190: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204186.22193: Set connection var ansible_shell_type to sh 41016 1727204186.22195: Set connection var ansible_timeout to 10 41016 1727204186.22198: Set connection var ansible_pipelining to False 41016 1727204186.22200: Set connection var ansible_connection to ssh 41016 1727204186.22202: variable 'ansible_shell_executable' from source: unknown 41016 1727204186.22204: variable 'ansible_connection' from source: unknown 41016 1727204186.22206: variable 'ansible_module_compression' from source: unknown 41016 1727204186.22208: variable 'ansible_shell_type' from source: unknown 41016 1727204186.22212: variable 'ansible_shell_executable' from source: unknown 41016 1727204186.22214: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.22215: variable 'ansible_pipelining' from source: unknown 41016 1727204186.22217: variable 'ansible_timeout' from source: unknown 41016 1727204186.22219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.22223: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 41016 1727204186.22225: variable 'omit' from source: magic vars 41016 1727204186.22227: starting attempt loop 41016 1727204186.22230: running the handler 41016 1727204186.22366: variable 'interface_stat' from source: set_fact 41016 1727204186.22391: Evaluated conditional (interface_stat.stat.exists): True 41016 1727204186.22405: handler run complete 41016 1727204186.22425: attempt loop complete, returning result 41016 1727204186.22432: _execute() done 41016 1727204186.22438: dumping result to json 41016 1727204186.22444: done dumping result, returning 41016 1727204186.22453: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'ethtest0' [028d2410-947f-12d5-0ec4-000000000215] 41016 1727204186.22462: sending task result for task 028d2410-947f-12d5-0ec4-000000000215 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 41016 1727204186.22604: no more pending results, returning what we have 41016 1727204186.22611: results queue empty 41016 1727204186.22612: checking for any_errors_fatal 41016 1727204186.22618: done checking for any_errors_fatal 41016 1727204186.22619: checking for max_fail_percentage 41016 1727204186.22621: done checking for max_fail_percentage 41016 1727204186.22622: checking to see if all hosts have failed and the running result is not ok 41016 1727204186.22622: done checking to see if all hosts have failed 41016 1727204186.22624: getting the remaining hosts for this loop 41016 1727204186.22625: done getting the remaining hosts for this loop 41016 1727204186.22629: getting the next task for host managed-node1 41016 1727204186.22636: done getting next task for host managed-node1 41016 1727204186.22639: ^ task is: TASK: Set interface1 41016 1727204186.22641: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204186.22644: getting variables 41016 1727204186.22646: in VariableManager get_vars() 41016 1727204186.22691: Calling all_inventory to load vars for managed-node1 41016 1727204186.22695: Calling groups_inventory to load vars for managed-node1 41016 1727204186.22697: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204186.22712: Calling all_plugins_play to load vars for managed-node1 41016 1727204186.22715: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204186.22718: Calling groups_plugins_play to load vars for managed-node1 41016 1727204186.23053: done sending task result for task 028d2410-947f-12d5-0ec4-000000000215 41016 1727204186.23056: WORKER PROCESS EXITING 41016 1727204186.23078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204186.23223: done with get_vars() 41016 1727204186.23232: done getting variables 41016 1727204186.23277: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set interface1] ********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:23 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.028) 0:00:09.909 ***** 41016 1727204186.23295: entering _queue_task() for managed-node1/set_fact 41016 1727204186.23497: worker is 1 (out of 1 available) 41016 1727204186.23513: exiting _queue_task() for managed-node1/set_fact 41016 1727204186.23523: done queuing things up, now waiting for results queue to drain 41016 1727204186.23524: waiting for pending results... 
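The assert above evaluates a single conditional, interface_stat.stat.exists, against the result registered by the stat task; "All assertions passed" simply means that key was true. The same check in plain Python, using a trimmed copy of the logged result (this is not Ansible's assert action plugin):

    def assert_device_present(interface_stat, interface):
        # interface_stat mimics the registered result of the stat task above
        assert interface_stat["stat"]["exists"], (
            f"interface {interface} is missing from /sys/class/net"
        )

    # Trimmed-down copy of the logged result for ethtest0
    interface_stat = {"stat": {"exists": True, "islnk": True,
                               "path": "/sys/class/net/ethtest0"}}
    assert_device_present(interface_stat, "ethtest0")
    print("All assertions passed")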
41016 1727204186.23675: running TaskExecutor() for managed-node1/TASK: Set interface1 41016 1727204186.23730: in run() - task 028d2410-947f-12d5-0ec4-00000000000f 41016 1727204186.23742: variable 'ansible_search_path' from source: unknown 41016 1727204186.23773: calling self._execute() 41016 1727204186.23834: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.23838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.23846: variable 'omit' from source: magic vars 41016 1727204186.24114: variable 'ansible_distribution_major_version' from source: facts 41016 1727204186.24121: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204186.24127: variable 'omit' from source: magic vars 41016 1727204186.24146: variable 'omit' from source: magic vars 41016 1727204186.24168: variable 'interface1' from source: play vars 41016 1727204186.24227: variable 'interface1' from source: play vars 41016 1727204186.24240: variable 'omit' from source: magic vars 41016 1727204186.24272: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204186.24307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204186.24319: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204186.24332: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204186.24341: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204186.24363: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204186.24366: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.24369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.24442: Set connection var ansible_shell_executable to /bin/sh 41016 1727204186.24445: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204186.24451: Set connection var ansible_shell_type to sh 41016 1727204186.24457: Set connection var ansible_timeout to 10 41016 1727204186.24461: Set connection var ansible_pipelining to False 41016 1727204186.24468: Set connection var ansible_connection to ssh 41016 1727204186.24486: variable 'ansible_shell_executable' from source: unknown 41016 1727204186.24489: variable 'ansible_connection' from source: unknown 41016 1727204186.24492: variable 'ansible_module_compression' from source: unknown 41016 1727204186.24494: variable 'ansible_shell_type' from source: unknown 41016 1727204186.24496: variable 'ansible_shell_executable' from source: unknown 41016 1727204186.24499: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.24503: variable 'ansible_pipelining' from source: unknown 41016 1727204186.24506: variable 'ansible_timeout' from source: unknown 41016 1727204186.24516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.24611: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 
1727204186.24617: variable 'omit' from source: magic vars 41016 1727204186.24625: starting attempt loop 41016 1727204186.24630: running the handler 41016 1727204186.24640: handler run complete 41016 1727204186.24646: attempt loop complete, returning result 41016 1727204186.24649: _execute() done 41016 1727204186.24652: dumping result to json 41016 1727204186.24656: done dumping result, returning 41016 1727204186.24662: done running TaskExecutor() for managed-node1/TASK: Set interface1 [028d2410-947f-12d5-0ec4-00000000000f] 41016 1727204186.24666: sending task result for task 028d2410-947f-12d5-0ec4-00000000000f 41016 1727204186.24757: done sending task result for task 028d2410-947f-12d5-0ec4-00000000000f 41016 1727204186.24760: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "interface": "ethtest1" }, "changed": false } 41016 1727204186.24832: no more pending results, returning what we have 41016 1727204186.24836: results queue empty 41016 1727204186.24837: checking for any_errors_fatal 41016 1727204186.24844: done checking for any_errors_fatal 41016 1727204186.24844: checking for max_fail_percentage 41016 1727204186.24846: done checking for max_fail_percentage 41016 1727204186.24848: checking to see if all hosts have failed and the running result is not ok 41016 1727204186.24848: done checking to see if all hosts have failed 41016 1727204186.24849: getting the remaining hosts for this loop 41016 1727204186.24850: done getting the remaining hosts for this loop 41016 1727204186.24854: getting the next task for host managed-node1 41016 1727204186.24858: done getting next task for host managed-node1 41016 1727204186.24861: ^ task is: TASK: Show interfaces 41016 1727204186.24863: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204186.24865: getting variables 41016 1727204186.24866: in VariableManager get_vars() 41016 1727204186.24904: Calling all_inventory to load vars for managed-node1 41016 1727204186.24906: Calling groups_inventory to load vars for managed-node1 41016 1727204186.24910: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204186.24919: Calling all_plugins_play to load vars for managed-node1 41016 1727204186.24921: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204186.24923: Calling groups_plugins_play to load vars for managed-node1 41016 1727204186.25095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204186.25730: done with get_vars() 41016 1727204186.25741: done getting variables TASK [Show interfaces] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:26 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.025) 0:00:09.934 ***** 41016 1727204186.25857: entering _queue_task() for managed-node1/include_tasks 41016 1727204186.26407: worker is 1 (out of 1 available) 41016 1727204186.26421: exiting _queue_task() for managed-node1/include_tasks 41016 1727204186.26432: done queuing things up, now waiting for results queue to drain 41016 1727204186.26433: waiting for pending results... 
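TASK [Set interface1] only rewrites the host fact 'interface': it was 'ethtest0' from an earlier set_fact (which is why the stat and assert above targeted ethtest0) and is now replaced with the play variable interface1, i.e. 'ethtest1'. Modeled with plain dicts rather than Ansible's VariableManager, the effect is roughly:

    play_vars = {"interface1": "ethtest1"}   # 'interface1' from play vars in the log
    host_facts = {"interface": "ethtest0"}   # earlier set_fact used by the stat/assert
    host_facts["interface"] = play_vars["interface1"]   # ok: ansible_facts.interface
    print(host_facts)                        # {'interface': 'ethtest1'}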
41016 1727204186.26600: running TaskExecutor() for managed-node1/TASK: Show interfaces 41016 1727204186.26662: in run() - task 028d2410-947f-12d5-0ec4-000000000010 41016 1727204186.26666: variable 'ansible_search_path' from source: unknown 41016 1727204186.26700: calling self._execute() 41016 1727204186.26767: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.26770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.26783: variable 'omit' from source: magic vars 41016 1727204186.27105: variable 'ansible_distribution_major_version' from source: facts 41016 1727204186.27116: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204186.27121: _execute() done 41016 1727204186.27126: dumping result to json 41016 1727204186.27129: done dumping result, returning 41016 1727204186.27136: done running TaskExecutor() for managed-node1/TASK: Show interfaces [028d2410-947f-12d5-0ec4-000000000010] 41016 1727204186.27140: sending task result for task 028d2410-947f-12d5-0ec4-000000000010 41016 1727204186.27224: done sending task result for task 028d2410-947f-12d5-0ec4-000000000010 41016 1727204186.27227: WORKER PROCESS EXITING 41016 1727204186.27255: no more pending results, returning what we have 41016 1727204186.27261: in VariableManager get_vars() 41016 1727204186.27308: Calling all_inventory to load vars for managed-node1 41016 1727204186.27311: Calling groups_inventory to load vars for managed-node1 41016 1727204186.27313: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204186.27323: Calling all_plugins_play to load vars for managed-node1 41016 1727204186.27326: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204186.27329: Calling groups_plugins_play to load vars for managed-node1 41016 1727204186.27493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204186.27607: done with get_vars() 41016 1727204186.27613: variable 'ansible_search_path' from source: unknown 41016 1727204186.27622: we have included files to process 41016 1727204186.27622: generating all_blocks data 41016 1727204186.27623: done generating all_blocks data 41016 1727204186.27628: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41016 1727204186.27629: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41016 1727204186.27631: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41016 1727204186.27697: in VariableManager get_vars() 41016 1727204186.27712: done with get_vars() 41016 1727204186.27785: done processing included file 41016 1727204186.27787: iterating over new_blocks loaded from include file 41016 1727204186.27788: in VariableManager get_vars() 41016 1727204186.27799: done with get_vars() 41016 1727204186.27800: filtering new block on tags 41016 1727204186.27810: done filtering new block on tags 41016 1727204186.27812: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 41016 1727204186.27815: extending task lists for all hosts with included blocks 41016 1727204186.28222: done extending task lists 
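The include handling above (loading show_interfaces.yml, generating block data, filtering on tags, extending the host's task list) is internal bookkeeping; conceptually Ansible parses another YAML task file and splices its tasks into the run. A rough illustration, assuming PyYAML is available and using a hypothetical path argument (this is not Ansible's loader, which additionally builds Block objects and applies tag filtering):

    import yaml  # PyYAML, assumed installed

    def load_included_tasks(path):
        """Parse a tasks file and return its task dicts, roughly what the
        'processing included file' / 'generating all_blocks data' steps cover."""
        with open(path) as f:
            tasks = yaml.safe_load(f) or []
        return [t for t in tasks if isinstance(t, dict)]

    # Hypothetical usage; the real file lives under the collection checkout:
    # load_included_tasks(".../tests/network/playbooks/tasks/show_interfaces.yml")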
41016 1727204186.28223: done processing included files 41016 1727204186.28224: results queue empty 41016 1727204186.28224: checking for any_errors_fatal 41016 1727204186.28226: done checking for any_errors_fatal 41016 1727204186.28226: checking for max_fail_percentage 41016 1727204186.28227: done checking for max_fail_percentage 41016 1727204186.28228: checking to see if all hosts have failed and the running result is not ok 41016 1727204186.28228: done checking to see if all hosts have failed 41016 1727204186.28229: getting the remaining hosts for this loop 41016 1727204186.28229: done getting the remaining hosts for this loop 41016 1727204186.28231: getting the next task for host managed-node1 41016 1727204186.28233: done getting next task for host managed-node1 41016 1727204186.28235: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 41016 1727204186.28236: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204186.28237: getting variables 41016 1727204186.28238: in VariableManager get_vars() 41016 1727204186.28246: Calling all_inventory to load vars for managed-node1 41016 1727204186.28248: Calling groups_inventory to load vars for managed-node1 41016 1727204186.28249: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204186.28252: Calling all_plugins_play to load vars for managed-node1 41016 1727204186.28254: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204186.28255: Calling groups_plugins_play to load vars for managed-node1 41016 1727204186.28356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204186.28491: done with get_vars() 41016 1727204186.28500: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.027) 0:00:09.961 ***** 41016 1727204186.28563: entering _queue_task() for managed-node1/include_tasks 41016 1727204186.28804: worker is 1 (out of 1 available) 41016 1727204186.28816: exiting _queue_task() for managed-node1/include_tasks 41016 1727204186.28828: done queuing things up, now waiting for results queue to drain 41016 1727204186.28829: waiting for pending results... 
41016 1727204186.29192: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 41016 1727204186.29198: in run() - task 028d2410-947f-12d5-0ec4-000000000282 41016 1727204186.29201: variable 'ansible_search_path' from source: unknown 41016 1727204186.29203: variable 'ansible_search_path' from source: unknown 41016 1727204186.29230: calling self._execute() 41016 1727204186.29319: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.29330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.29343: variable 'omit' from source: magic vars 41016 1727204186.29690: variable 'ansible_distribution_major_version' from source: facts 41016 1727204186.29707: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204186.29718: _execute() done 41016 1727204186.29726: dumping result to json 41016 1727204186.29754: done dumping result, returning 41016 1727204186.29757: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [028d2410-947f-12d5-0ec4-000000000282] 41016 1727204186.29759: sending task result for task 028d2410-947f-12d5-0ec4-000000000282 41016 1727204186.29906: no more pending results, returning what we have 41016 1727204186.29912: in VariableManager get_vars() 41016 1727204186.29958: Calling all_inventory to load vars for managed-node1 41016 1727204186.29961: Calling groups_inventory to load vars for managed-node1 41016 1727204186.29964: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204186.29980: Calling all_plugins_play to load vars for managed-node1 41016 1727204186.29983: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204186.29986: Calling groups_plugins_play to load vars for managed-node1 41016 1727204186.30330: done sending task result for task 028d2410-947f-12d5-0ec4-000000000282 41016 1727204186.30333: WORKER PROCESS EXITING 41016 1727204186.30352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204186.30542: done with get_vars() 41016 1727204186.30550: variable 'ansible_search_path' from source: unknown 41016 1727204186.30551: variable 'ansible_search_path' from source: unknown 41016 1727204186.30589: we have included files to process 41016 1727204186.30590: generating all_blocks data 41016 1727204186.30591: done generating all_blocks data 41016 1727204186.30592: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41016 1727204186.30593: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41016 1727204186.30596: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41016 1727204186.30854: done processing included file 41016 1727204186.30856: iterating over new_blocks loaded from include file 41016 1727204186.30857: in VariableManager get_vars() 41016 1727204186.30907: done with get_vars() 41016 1727204186.30909: filtering new block on tags 41016 1727204186.30926: done filtering new block on tags 41016 1727204186.30929: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node1 41016 1727204186.30934: extending task lists for all hosts with included blocks 41016 1727204186.31033: done extending task lists 41016 1727204186.31034: done processing included files 41016 1727204186.31035: results queue empty 41016 1727204186.31036: checking for any_errors_fatal 41016 1727204186.31038: done checking for any_errors_fatal 41016 1727204186.31039: checking for max_fail_percentage 41016 1727204186.31040: done checking for max_fail_percentage 41016 1727204186.31041: checking to see if all hosts have failed and the running result is not ok 41016 1727204186.31042: done checking to see if all hosts have failed 41016 1727204186.31042: getting the remaining hosts for this loop 41016 1727204186.31043: done getting the remaining hosts for this loop 41016 1727204186.31046: getting the next task for host managed-node1 41016 1727204186.31050: done getting next task for host managed-node1 41016 1727204186.31052: ^ task is: TASK: Gather current interface info 41016 1727204186.31055: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204186.31058: getting variables 41016 1727204186.31059: in VariableManager get_vars() 41016 1727204186.31071: Calling all_inventory to load vars for managed-node1 41016 1727204186.31074: Calling groups_inventory to load vars for managed-node1 41016 1727204186.31079: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204186.31084: Calling all_plugins_play to load vars for managed-node1 41016 1727204186.31086: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204186.31089: Calling groups_plugins_play to load vars for managed-node1 41016 1727204186.31222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204186.31403: done with get_vars() 41016 1727204186.31412: done getting variables 41016 1727204186.31449: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.029) 0:00:09.990 ***** 41016 1727204186.31480: entering _queue_task() for managed-node1/command 41016 1727204186.31756: worker is 1 (out of 1 available) 41016 1727204186.31768: exiting _queue_task() for managed-node1/command 41016 1727204186.31884: done queuing things up, now waiting for results queue to drain 41016 1727204186.31886: waiting for pending results... 
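The 'Gather current interface info' task runs a command on the managed node; the exact command line is not visible in this excerpt, but judging from get_current_interfaces.yml its purpose is to collect the names of the interfaces currently present. An equivalent check done directly in Python on a Linux host would be:

    import os

    def current_interfaces():
        # Each entry in /sys/class/net is an interface the kernel currently
        # knows about (e.g. lo and the test device ethtest0).
        return sorted(os.listdir("/sys/class/net"))

    print(current_interfaces())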
41016 1727204186.32050: running TaskExecutor() for managed-node1/TASK: Gather current interface info 41016 1727204186.32157: in run() - task 028d2410-947f-12d5-0ec4-0000000002e0 41016 1727204186.32178: variable 'ansible_search_path' from source: unknown 41016 1727204186.32187: variable 'ansible_search_path' from source: unknown 41016 1727204186.32229: calling self._execute() 41016 1727204186.32316: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.32332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.32348: variable 'omit' from source: magic vars 41016 1727204186.32713: variable 'ansible_distribution_major_version' from source: facts 41016 1727204186.32727: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204186.32736: variable 'omit' from source: magic vars 41016 1727204186.32783: variable 'omit' from source: magic vars 41016 1727204186.32818: variable 'omit' from source: magic vars 41016 1727204186.32857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204186.32901: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204186.32925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204186.32947: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204186.32963: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204186.33000: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204186.33012: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.33020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.33126: Set connection var ansible_shell_executable to /bin/sh 41016 1727204186.33137: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204186.33280: Set connection var ansible_shell_type to sh 41016 1727204186.33284: Set connection var ansible_timeout to 10 41016 1727204186.33286: Set connection var ansible_pipelining to False 41016 1727204186.33288: Set connection var ansible_connection to ssh 41016 1727204186.33290: variable 'ansible_shell_executable' from source: unknown 41016 1727204186.33292: variable 'ansible_connection' from source: unknown 41016 1727204186.33295: variable 'ansible_module_compression' from source: unknown 41016 1727204186.33297: variable 'ansible_shell_type' from source: unknown 41016 1727204186.33299: variable 'ansible_shell_executable' from source: unknown 41016 1727204186.33300: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.33302: variable 'ansible_pipelining' from source: unknown 41016 1727204186.33304: variable 'ansible_timeout' from source: unknown 41016 1727204186.33306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.33364: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204186.33384: variable 'omit' from source: magic vars 41016 
1727204186.33393: starting attempt loop 41016 1727204186.33400: running the handler 41016 1727204186.33422: _low_level_execute_command(): starting 41016 1727204186.33434: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204186.34145: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204186.34195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204186.34214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204186.34306: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204186.34330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204186.34348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204186.34464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204186.36255: stdout chunk (state=3): >>>/root <<< 41016 1727204186.36376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204186.36392: stdout chunk (state=3): >>><<< 41016 1727204186.36406: stderr chunk (state=3): >>><<< 41016 1727204186.36536: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204186.36540: _low_level_execute_command(): starting 41016 1727204186.36543: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204186.3643599-42023-27395160043651 
`" && echo ansible-tmp-1727204186.3643599-42023-27395160043651="` echo /root/.ansible/tmp/ansible-tmp-1727204186.3643599-42023-27395160043651 `" ) && sleep 0' 41016 1727204186.37117: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204186.37131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204186.37148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204186.37167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204186.37194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204186.37235: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204186.37249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204186.37292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204186.37345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204186.37373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204186.37414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204186.37573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204186.39672: stdout chunk (state=3): >>>ansible-tmp-1727204186.3643599-42023-27395160043651=/root/.ansible/tmp/ansible-tmp-1727204186.3643599-42023-27395160043651 <<< 41016 1727204186.39831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204186.39846: stderr chunk (state=3): >>><<< 41016 1727204186.39857: stdout chunk (state=3): >>><<< 41016 1727204186.40081: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204186.3643599-42023-27395160043651=/root/.ansible/tmp/ansible-tmp-1727204186.3643599-42023-27395160043651 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204186.40085: variable 'ansible_module_compression' from source: unknown 41016 1727204186.40087: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204186.40089: variable 'ansible_facts' from source: unknown 41016 1727204186.40122: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204186.3643599-42023-27395160043651/AnsiballZ_command.py 41016 1727204186.40341: Sending initial data 41016 1727204186.40344: Sent initial data (155 bytes) 41016 1727204186.40998: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204186.41105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204186.41139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204186.41156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204186.41183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204186.41308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204186.43097: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204186.43213: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204186.43385: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpp6qr5v93 /root/.ansible/tmp/ansible-tmp-1727204186.3643599-42023-27395160043651/AnsiballZ_command.py <<< 41016 1727204186.43388: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204186.3643599-42023-27395160043651/AnsiballZ_command.py" <<< 41016 1727204186.43427: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpp6qr5v93" to remote "/root/.ansible/tmp/ansible-tmp-1727204186.3643599-42023-27395160043651/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204186.3643599-42023-27395160043651/AnsiballZ_command.py" <<< 41016 1727204186.44447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204186.44449: stderr chunk (state=3): >>><<< 41016 1727204186.44451: stdout chunk (state=3): >>><<< 41016 1727204186.44502: done transferring module to remote 41016 1727204186.44505: _low_level_execute_command(): starting 41016 1727204186.44510: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204186.3643599-42023-27395160043651/ /root/.ansible/tmp/ansible-tmp-1727204186.3643599-42023-27395160043651/AnsiballZ_command.py && sleep 0' 41016 1727204186.45883: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204186.45887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204186.45986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204186.45992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204186.46394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204186.46499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204186.48642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204186.48646: stdout chunk (state=3): >>><<< 41016 1727204186.48649: stderr chunk (state=3): >>><<< 41016 1727204186.48651: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204186.48654: _low_level_execute_command(): starting 41016 1727204186.48656: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204186.3643599-42023-27395160043651/AnsiballZ_command.py && sleep 0' 41016 1727204186.49268: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204186.49295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204186.49298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204186.49304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204186.49403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204186.49406: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204186.49411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204186.49413: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204186.49415: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204186.49417: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41016 1727204186.49419: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204186.49421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204186.49422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204186.49425: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204186.49427: stderr chunk (state=3): >>>debug2: match found <<< 41016 1727204186.49452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204186.49488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204186.49508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204186.49518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204186.49633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204186.66509: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:56:26.659606", "end": "2024-09-24 14:56:26.663068", 
"delta": "0:00:00.003462", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204186.68541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204186.68544: stdout chunk (state=3): >>><<< 41016 1727204186.68546: stderr chunk (state=3): >>><<< 41016 1727204186.68565: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:56:26.659606", "end": "2024-09-24 14:56:26.663068", "delta": "0:00:00.003462", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
41016 1727204186.68842: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204186.3643599-42023-27395160043651/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204186.68845: _low_level_execute_command(): starting 41016 1727204186.68848: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204186.3643599-42023-27395160043651/ > /dev/null 2>&1 && sleep 0' 41016 1727204186.70281: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204186.70486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204186.70490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204186.70492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204186.70494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204186.70496: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204186.70497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204186.70499: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204186.70591: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204186.70708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204186.70893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204186.71030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204186.73068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204186.73073: stdout chunk (state=3): >>><<< 41016 1727204186.73081: stderr chunk (state=3): >>><<< 41016 1727204186.73099: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204186.73107: handler run complete 41016 1727204186.73134: Evaluated conditional (False): False 41016 1727204186.73142: attempt loop complete, returning result 41016 1727204186.73144: _execute() done 41016 1727204186.73147: dumping result to json 41016 1727204186.73154: done dumping result, returning 41016 1727204186.73162: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [028d2410-947f-12d5-0ec4-0000000002e0] 41016 1727204186.73166: sending task result for task 028d2410-947f-12d5-0ec4-0000000002e0 41016 1727204186.73383: done sending task result for task 028d2410-947f-12d5-0ec4-0000000002e0 41016 1727204186.73387: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003462", "end": "2024-09-24 14:56:26.663068", "rc": 0, "start": "2024-09-24 14:56:26.659606" } STDOUT: bonding_masters eth0 ethtest0 lo peerethtest0 41016 1727204186.73464: no more pending results, returning what we have 41016 1727204186.73468: results queue empty 41016 1727204186.73469: checking for any_errors_fatal 41016 1727204186.73470: done checking for any_errors_fatal 41016 1727204186.73471: checking for max_fail_percentage 41016 1727204186.73472: done checking for max_fail_percentage 41016 1727204186.73474: checking to see if all hosts have failed and the running result is not ok 41016 1727204186.73477: done checking to see if all hosts have failed 41016 1727204186.73478: getting the remaining hosts for this loop 41016 1727204186.73479: done getting the remaining hosts for this loop 41016 1727204186.73483: getting the next task for host managed-node1 41016 1727204186.73491: done getting next task for host managed-node1 41016 1727204186.73493: ^ task is: TASK: Set current_interfaces 41016 1727204186.73502: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204186.73507: getting variables 41016 1727204186.73511: in VariableManager get_vars() 41016 1727204186.73556: Calling all_inventory to load vars for managed-node1 41016 1727204186.73559: Calling groups_inventory to load vars for managed-node1 41016 1727204186.73562: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204186.73572: Calling all_plugins_play to load vars for managed-node1 41016 1727204186.73780: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204186.73787: Calling groups_plugins_play to load vars for managed-node1 41016 1727204186.74136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204186.74528: done with get_vars() 41016 1727204186.74651: done getting variables 41016 1727204186.74720: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.432) 0:00:10.423 ***** 41016 1727204186.74759: entering _queue_task() for managed-node1/set_fact 41016 1727204186.75139: worker is 1 (out of 1 available) 41016 1727204186.75161: exiting _queue_task() for managed-node1/set_fact 41016 1727204186.75184: done queuing things up, now waiting for results queue to drain 41016 1727204186.75185: waiting for pending results... 
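
The task banner above points at get_current_interfaces.yml:9. Given the registered _current_interfaces command result and the fact value reported in the result below (the five names listed under /sys/class/net), the task is most plausibly a set_fact over the command's stdout_lines, roughly as sketched here; the exact Jinja expression in the file is not shown in the log and may differ.

# Sketch of the "Set current_interfaces" task (reconstruction, not the verbatim file)
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # yields the interface list shown in the result below
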
41016 1727204186.75539: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 41016 1727204186.75562: in run() - task 028d2410-947f-12d5-0ec4-0000000002e1 41016 1727204186.75574: variable 'ansible_search_path' from source: unknown 41016 1727204186.75579: variable 'ansible_search_path' from source: unknown 41016 1727204186.75615: calling self._execute() 41016 1727204186.75704: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.75708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.75718: variable 'omit' from source: magic vars 41016 1727204186.76074: variable 'ansible_distribution_major_version' from source: facts 41016 1727204186.76081: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204186.76083: variable 'omit' from source: magic vars 41016 1727204186.76183: variable 'omit' from source: magic vars 41016 1727204186.76226: variable '_current_interfaces' from source: set_fact 41016 1727204186.76291: variable 'omit' from source: magic vars 41016 1727204186.76325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204186.76359: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204186.76389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204186.76481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204186.76484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204186.76486: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204186.76488: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.76490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.76577: Set connection var ansible_shell_executable to /bin/sh 41016 1727204186.76584: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204186.76590: Set connection var ansible_shell_type to sh 41016 1727204186.76596: Set connection var ansible_timeout to 10 41016 1727204186.76603: Set connection var ansible_pipelining to False 41016 1727204186.76614: Set connection var ansible_connection to ssh 41016 1727204186.76633: variable 'ansible_shell_executable' from source: unknown 41016 1727204186.76637: variable 'ansible_connection' from source: unknown 41016 1727204186.76639: variable 'ansible_module_compression' from source: unknown 41016 1727204186.76642: variable 'ansible_shell_type' from source: unknown 41016 1727204186.76644: variable 'ansible_shell_executable' from source: unknown 41016 1727204186.76646: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.76651: variable 'ansible_pipelining' from source: unknown 41016 1727204186.76653: variable 'ansible_timeout' from source: unknown 41016 1727204186.76657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.76997: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 41016 1727204186.77050: variable 'omit' from source: magic vars 41016 1727204186.77053: starting attempt loop 41016 1727204186.77055: running the handler 41016 1727204186.77058: handler run complete 41016 1727204186.77060: attempt loop complete, returning result 41016 1727204186.77062: _execute() done 41016 1727204186.77064: dumping result to json 41016 1727204186.77066: done dumping result, returning 41016 1727204186.77068: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [028d2410-947f-12d5-0ec4-0000000002e1] 41016 1727204186.77070: sending task result for task 028d2410-947f-12d5-0ec4-0000000002e1 41016 1727204186.77314: done sending task result for task 028d2410-947f-12d5-0ec4-0000000002e1 41016 1727204186.77318: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "ethtest0", "lo", "peerethtest0" ] }, "changed": false } 41016 1727204186.77367: no more pending results, returning what we have 41016 1727204186.77370: results queue empty 41016 1727204186.77371: checking for any_errors_fatal 41016 1727204186.77380: done checking for any_errors_fatal 41016 1727204186.77381: checking for max_fail_percentage 41016 1727204186.77382: done checking for max_fail_percentage 41016 1727204186.77383: checking to see if all hosts have failed and the running result is not ok 41016 1727204186.77384: done checking to see if all hosts have failed 41016 1727204186.77384: getting the remaining hosts for this loop 41016 1727204186.77386: done getting the remaining hosts for this loop 41016 1727204186.77389: getting the next task for host managed-node1 41016 1727204186.77396: done getting next task for host managed-node1 41016 1727204186.77398: ^ task is: TASK: Show current_interfaces 41016 1727204186.77400: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204186.77403: getting variables 41016 1727204186.77404: in VariableManager get_vars() 41016 1727204186.77441: Calling all_inventory to load vars for managed-node1 41016 1727204186.77444: Calling groups_inventory to load vars for managed-node1 41016 1727204186.77447: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204186.77455: Calling all_plugins_play to load vars for managed-node1 41016 1727204186.77458: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204186.77461: Calling groups_plugins_play to load vars for managed-node1 41016 1727204186.77640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204186.77844: done with get_vars() 41016 1727204186.77853: done getting variables 41016 1727204186.77905: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.031) 0:00:10.455 ***** 41016 1727204186.77940: entering _queue_task() for managed-node1/debug 41016 1727204186.78195: worker is 1 (out of 1 available) 41016 1727204186.78206: exiting _queue_task() for managed-node1/debug 41016 1727204186.78217: done queuing things up, now waiting for results queue to drain 41016 1727204186.78219: waiting for pending results... 41016 1727204186.78586: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 41016 1727204186.78592: in run() - task 028d2410-947f-12d5-0ec4-000000000283 41016 1727204186.78605: variable 'ansible_search_path' from source: unknown 41016 1727204186.78612: variable 'ansible_search_path' from source: unknown 41016 1727204186.78652: calling self._execute() 41016 1727204186.78745: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.78757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.78771: variable 'omit' from source: magic vars 41016 1727204186.79164: variable 'ansible_distribution_major_version' from source: facts 41016 1727204186.79184: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204186.79196: variable 'omit' from source: magic vars 41016 1727204186.79280: variable 'omit' from source: magic vars 41016 1727204186.79352: variable 'current_interfaces' from source: set_fact 41016 1727204186.79387: variable 'omit' from source: magic vars 41016 1727204186.79447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204186.79581: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204186.79612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204186.79642: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204186.79673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204186.79734: 
variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204186.79737: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.79739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.79886: Set connection var ansible_shell_executable to /bin/sh 41016 1727204186.79888: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204186.79891: Set connection var ansible_shell_type to sh 41016 1727204186.79910: Set connection var ansible_timeout to 10 41016 1727204186.79927: Set connection var ansible_pipelining to False 41016 1727204186.79939: Set connection var ansible_connection to ssh 41016 1727204186.79963: variable 'ansible_shell_executable' from source: unknown 41016 1727204186.79971: variable 'ansible_connection' from source: unknown 41016 1727204186.79984: variable 'ansible_module_compression' from source: unknown 41016 1727204186.79996: variable 'ansible_shell_type' from source: unknown 41016 1727204186.80093: variable 'ansible_shell_executable' from source: unknown 41016 1727204186.80097: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.80100: variable 'ansible_pipelining' from source: unknown 41016 1727204186.80102: variable 'ansible_timeout' from source: unknown 41016 1727204186.80104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.80190: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204186.80219: variable 'omit' from source: magic vars 41016 1727204186.80288: starting attempt loop 41016 1727204186.80291: running the handler 41016 1727204186.80293: handler run complete 41016 1727204186.80314: attempt loop complete, returning result 41016 1727204186.80326: _execute() done 41016 1727204186.80333: dumping result to json 41016 1727204186.80341: done dumping result, returning 41016 1727204186.80427: done running TaskExecutor() for managed-node1/TASK: Show current_interfaces [028d2410-947f-12d5-0ec4-000000000283] 41016 1727204186.80432: sending task result for task 028d2410-947f-12d5-0ec4-000000000283 41016 1727204186.80499: done sending task result for task 028d2410-947f-12d5-0ec4-000000000283 41016 1727204186.80503: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'ethtest0', 'lo', 'peerethtest0'] 41016 1727204186.80577: no more pending results, returning what we have 41016 1727204186.80582: results queue empty 41016 1727204186.80584: checking for any_errors_fatal 41016 1727204186.80588: done checking for any_errors_fatal 41016 1727204186.80589: checking for max_fail_percentage 41016 1727204186.80591: done checking for max_fail_percentage 41016 1727204186.80592: checking to see if all hosts have failed and the running result is not ok 41016 1727204186.80593: done checking to see if all hosts have failed 41016 1727204186.80594: getting the remaining hosts for this loop 41016 1727204186.80595: done getting the remaining hosts for this loop 41016 1727204186.80599: getting the next task for host managed-node1 41016 1727204186.80608: done getting next task for host managed-node1 41016 1727204186.80612: ^ task is: TASK: Manage test interface 41016 1727204186.80649: ^ state is: HOST STATE: 
block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204186.80653: getting variables 41016 1727204186.80655: in VariableManager get_vars() 41016 1727204186.80761: Calling all_inventory to load vars for managed-node1 41016 1727204186.80764: Calling groups_inventory to load vars for managed-node1 41016 1727204186.80767: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204186.80779: Calling all_plugins_play to load vars for managed-node1 41016 1727204186.80782: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204186.80786: Calling groups_plugins_play to load vars for managed-node1 41016 1727204186.81206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204186.81446: done with get_vars() 41016 1727204186.81455: done getting variables TASK [Manage test interface] *************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:28 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.036) 0:00:10.491 ***** 41016 1727204186.81547: entering _queue_task() for managed-node1/include_tasks 41016 1727204186.81839: worker is 1 (out of 1 available) 41016 1727204186.81966: exiting _queue_task() for managed-node1/include_tasks 41016 1727204186.81979: done queuing things up, now waiting for results queue to drain 41016 1727204186.81981: waiting for pending results... 41016 1727204186.82256: running TaskExecutor() for managed-node1/TASK: Manage test interface 41016 1727204186.82302: in run() - task 028d2410-947f-12d5-0ec4-000000000011 41016 1727204186.82328: variable 'ansible_search_path' from source: unknown 41016 1727204186.82374: calling self._execute() 41016 1727204186.82507: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.82512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.82514: variable 'omit' from source: magic vars 41016 1727204186.82941: variable 'ansible_distribution_major_version' from source: facts 41016 1727204186.82958: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204186.82969: _execute() done 41016 1727204186.82981: dumping result to json 41016 1727204186.82989: done dumping result, returning 41016 1727204186.83000: done running TaskExecutor() for managed-node1/TASK: Manage test interface [028d2410-947f-12d5-0ec4-000000000011] 41016 1727204186.83050: sending task result for task 028d2410-947f-12d5-0ec4-000000000011 41016 1727204186.83122: done sending task result for task 028d2410-947f-12d5-0ec4-000000000011 41016 1727204186.83125: WORKER PROCESS EXITING 41016 1727204186.83180: no more pending results, returning what we have 41016 1727204186.83187: in VariableManager get_vars() 41016 1727204186.83231: Calling all_inventory to load vars for managed-node1 41016 1727204186.83235: Calling groups_inventory to load vars for managed-node1 41016 1727204186.83238: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204186.83250: Calling all_plugins_play to load vars for managed-node1 41016 1727204186.83254: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204186.83257: Calling 
groups_plugins_play to load vars for managed-node1 41016 1727204186.83688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204186.83913: done with get_vars() 41016 1727204186.83927: variable 'ansible_search_path' from source: unknown 41016 1727204186.83939: we have included files to process 41016 1727204186.83940: generating all_blocks data 41016 1727204186.83942: done generating all_blocks data 41016 1727204186.83948: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41016 1727204186.83949: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41016 1727204186.83952: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 41016 1727204186.84401: in VariableManager get_vars() 41016 1727204186.84422: done with get_vars() 41016 1727204186.85257: done processing included file 41016 1727204186.85259: iterating over new_blocks loaded from include file 41016 1727204186.85260: in VariableManager get_vars() 41016 1727204186.85280: done with get_vars() 41016 1727204186.85282: filtering new block on tags 41016 1727204186.85326: done filtering new block on tags 41016 1727204186.85335: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node1 41016 1727204186.85344: extending task lists for all hosts with included blocks 41016 1727204186.86726: done extending task lists 41016 1727204186.86728: done processing included files 41016 1727204186.86728: results queue empty 41016 1727204186.86729: checking for any_errors_fatal 41016 1727204186.86732: done checking for any_errors_fatal 41016 1727204186.86733: checking for max_fail_percentage 41016 1727204186.86734: done checking for max_fail_percentage 41016 1727204186.86734: checking to see if all hosts have failed and the running result is not ok 41016 1727204186.86735: done checking to see if all hosts have failed 41016 1727204186.86736: getting the remaining hosts for this loop 41016 1727204186.86737: done getting the remaining hosts for this loop 41016 1727204186.86739: getting the next task for host managed-node1 41016 1727204186.86779: done getting next task for host managed-node1 41016 1727204186.86782: ^ task is: TASK: Ensure state in ["present", "absent"] 41016 1727204186.86784: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204186.86786: getting variables 41016 1727204186.86787: in VariableManager get_vars() 41016 1727204186.86863: Calling all_inventory to load vars for managed-node1 41016 1727204186.86866: Calling groups_inventory to load vars for managed-node1 41016 1727204186.86868: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204186.86873: Calling all_plugins_play to load vars for managed-node1 41016 1727204186.86876: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204186.86879: Calling groups_plugins_play to load vars for managed-node1 41016 1727204186.87219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204186.87636: done with get_vars() 41016 1727204186.87645: done getting variables 41016 1727204186.87777: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.062) 0:00:10.554 ***** 41016 1727204186.87801: entering _queue_task() for managed-node1/fail 41016 1727204186.88264: worker is 1 (out of 1 available) 41016 1727204186.88274: exiting _queue_task() for managed-node1/fail 41016 1727204186.88289: done queuing things up, now waiting for results queue to drain 41016 1727204186.88290: waiting for pending results... 41016 1727204186.88579: running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] 41016 1727204186.88589: in run() - task 028d2410-947f-12d5-0ec4-0000000002fc 41016 1727204186.88592: variable 'ansible_search_path' from source: unknown 41016 1727204186.88597: variable 'ansible_search_path' from source: unknown 41016 1727204186.88640: calling self._execute() 41016 1727204186.88726: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.88739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.88756: variable 'omit' from source: magic vars 41016 1727204186.89133: variable 'ansible_distribution_major_version' from source: facts 41016 1727204186.89139: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204186.89362: variable 'state' from source: include params 41016 1727204186.89365: Evaluated conditional (state not in ["present", "absent"]): False 41016 1727204186.89367: when evaluation is False, skipping this task 41016 1727204186.89369: _execute() done 41016 1727204186.89371: dumping result to json 41016 1727204186.89380: done dumping result, returning 41016 1727204186.89398: done running TaskExecutor() for managed-node1/TASK: Ensure state in ["present", "absent"] [028d2410-947f-12d5-0ec4-0000000002fc] 41016 1727204186.89431: sending task result for task 028d2410-947f-12d5-0ec4-0000000002fc skipping: [managed-node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 41016 1727204186.89606: no more pending results, returning what we have 41016 1727204186.89698: results queue empty 41016 1727204186.89700: checking for any_errors_fatal 41016 
1727204186.89701: done checking for any_errors_fatal 41016 1727204186.89702: checking for max_fail_percentage 41016 1727204186.89703: done checking for max_fail_percentage 41016 1727204186.89704: checking to see if all hosts have failed and the running result is not ok 41016 1727204186.89705: done checking to see if all hosts have failed 41016 1727204186.89705: getting the remaining hosts for this loop 41016 1727204186.89707: done getting the remaining hosts for this loop 41016 1727204186.89710: getting the next task for host managed-node1 41016 1727204186.89714: done getting next task for host managed-node1 41016 1727204186.89717: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 41016 1727204186.89744: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204186.89748: getting variables 41016 1727204186.89749: in VariableManager get_vars() 41016 1727204186.89787: Calling all_inventory to load vars for managed-node1 41016 1727204186.89790: Calling groups_inventory to load vars for managed-node1 41016 1727204186.89792: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204186.89801: Calling all_plugins_play to load vars for managed-node1 41016 1727204186.89804: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204186.89811: Calling groups_plugins_play to load vars for managed-node1 41016 1727204186.89980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204186.90191: done with get_vars() 41016 1727204186.90199: done getting variables 41016 1727204186.90247: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.024) 0:00:10.578 ***** 41016 1727204186.90271: entering _queue_task() for managed-node1/fail 41016 1727204186.90516: worker is 1 (out of 1 available) 41016 1727204186.90527: exiting _queue_task() for managed-node1/fail 41016 1727204186.90538: done queuing things up, now waiting for results queue to drain 41016 1727204186.90540: waiting for pending results... 
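
Both guard conditionals in this block are quoted verbatim in the skip results: "state not in [\"present\", \"absent\"]" above and "type not in [\"dummy\", \"tap\", \"veth\"]" in the task that follows. The tasks at manage_test_interface.yml:3 and :8 therefore look roughly like the sketch below; the fail messages are assumptions, since both tasks were skipped and no message text appears in the log.

# Sketch of the two input guards in manage_test_interface.yml (reconstruction, not the verbatim file)
- name: Ensure state in ["present", "absent"]
  fail:
    msg: state must be present or absent     # message text is an assumption
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: type must be dummy, tap or veth     # message text is an assumption
  when: type not in ["dummy", "tap", "veth"]
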
41016 1727204186.90572: done sending task result for task 028d2410-947f-12d5-0ec4-0000000002fc 41016 1727204186.90579: WORKER PROCESS EXITING 41016 1727204186.90772: running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] 41016 1727204186.90867: in run() - task 028d2410-947f-12d5-0ec4-0000000002fd 41016 1727204186.90896: variable 'ansible_search_path' from source: unknown 41016 1727204186.90904: variable 'ansible_search_path' from source: unknown 41016 1727204186.90944: calling self._execute() 41016 1727204186.91029: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.91041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.91054: variable 'omit' from source: magic vars 41016 1727204186.91443: variable 'ansible_distribution_major_version' from source: facts 41016 1727204186.91581: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204186.91616: variable 'type' from source: set_fact 41016 1727204186.91627: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 41016 1727204186.91635: when evaluation is False, skipping this task 41016 1727204186.91642: _execute() done 41016 1727204186.91649: dumping result to json 41016 1727204186.91657: done dumping result, returning 41016 1727204186.91667: done running TaskExecutor() for managed-node1/TASK: Ensure type in ["dummy", "tap", "veth"] [028d2410-947f-12d5-0ec4-0000000002fd] 41016 1727204186.91678: sending task result for task 028d2410-947f-12d5-0ec4-0000000002fd skipping: [managed-node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 41016 1727204186.91852: no more pending results, returning what we have 41016 1727204186.91856: results queue empty 41016 1727204186.91857: checking for any_errors_fatal 41016 1727204186.91862: done checking for any_errors_fatal 41016 1727204186.91863: checking for max_fail_percentage 41016 1727204186.91865: done checking for max_fail_percentage 41016 1727204186.91866: checking to see if all hosts have failed and the running result is not ok 41016 1727204186.91867: done checking to see if all hosts have failed 41016 1727204186.91868: getting the remaining hosts for this loop 41016 1727204186.91870: done getting the remaining hosts for this loop 41016 1727204186.91873: getting the next task for host managed-node1 41016 1727204186.91885: done getting next task for host managed-node1 41016 1727204186.91888: ^ task is: TASK: Include the task 'show_interfaces.yml' 41016 1727204186.91891: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204186.91896: getting variables 41016 1727204186.91898: in VariableManager get_vars() 41016 1727204186.91945: Calling all_inventory to load vars for managed-node1 41016 1727204186.91948: Calling groups_inventory to load vars for managed-node1 41016 1727204186.91951: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204186.91964: Calling all_plugins_play to load vars for managed-node1 41016 1727204186.91967: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204186.91971: Calling groups_plugins_play to load vars for managed-node1 41016 1727204186.92188: done sending task result for task 028d2410-947f-12d5-0ec4-0000000002fd 41016 1727204186.92191: WORKER PROCESS EXITING 41016 1727204186.92363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204186.92697: done with get_vars() 41016 1727204186.92708: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.025) 0:00:10.604 ***** 41016 1727204186.92795: entering _queue_task() for managed-node1/include_tasks 41016 1727204186.93236: worker is 1 (out of 1 available) 41016 1727204186.93249: exiting _queue_task() for managed-node1/include_tasks 41016 1727204186.93338: done queuing things up, now waiting for results queue to drain 41016 1727204186.93340: waiting for pending results... 41016 1727204186.93550: running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' 41016 1727204186.93748: in run() - task 028d2410-947f-12d5-0ec4-0000000002fe 41016 1727204186.93760: variable 'ansible_search_path' from source: unknown 41016 1727204186.93764: variable 'ansible_search_path' from source: unknown 41016 1727204186.94020: calling self._execute() 41016 1727204186.94023: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204186.94117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204186.94132: variable 'omit' from source: magic vars 41016 1727204186.94883: variable 'ansible_distribution_major_version' from source: facts 41016 1727204186.94886: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204186.94889: _execute() done 41016 1727204186.94891: dumping result to json 41016 1727204186.94894: done dumping result, returning 41016 1727204186.94896: done running TaskExecutor() for managed-node1/TASK: Include the task 'show_interfaces.yml' [028d2410-947f-12d5-0ec4-0000000002fe] 41016 1727204186.94898: sending task result for task 028d2410-947f-12d5-0ec4-0000000002fe 41016 1727204186.95131: done sending task result for task 028d2410-947f-12d5-0ec4-0000000002fe 41016 1727204186.95160: no more pending results, returning what we have 41016 1727204186.95166: in VariableManager get_vars() 41016 1727204186.95224: Calling all_inventory to load vars for managed-node1 41016 1727204186.95227: Calling groups_inventory to load vars for managed-node1 41016 1727204186.95230: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204186.95249: Calling all_plugins_play to load vars for managed-node1 41016 1727204186.95252: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204186.95255: Calling groups_plugins_play to load vars for managed-node1 41016 
1727204186.95983: WORKER PROCESS EXITING 41016 1727204186.96597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204186.96815: done with get_vars() 41016 1727204186.96823: variable 'ansible_search_path' from source: unknown 41016 1727204186.96824: variable 'ansible_search_path' from source: unknown 41016 1727204186.96860: we have included files to process 41016 1727204186.96861: generating all_blocks data 41016 1727204186.96863: done generating all_blocks data 41016 1727204186.96865: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41016 1727204186.96866: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41016 1727204186.96868: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 41016 1727204186.97021: in VariableManager get_vars() 41016 1727204186.97044: done with get_vars() 41016 1727204186.97160: done processing included file 41016 1727204186.97162: iterating over new_blocks loaded from include file 41016 1727204186.97163: in VariableManager get_vars() 41016 1727204186.97515: done with get_vars() 41016 1727204186.97518: filtering new block on tags 41016 1727204186.97537: done filtering new block on tags 41016 1727204186.97539: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node1 41016 1727204186.97544: extending task lists for all hosts with included blocks 41016 1727204186.98470: done extending task lists 41016 1727204186.98472: done processing included files 41016 1727204186.98473: results queue empty 41016 1727204186.98474: checking for any_errors_fatal 41016 1727204186.98478: done checking for any_errors_fatal 41016 1727204186.98479: checking for max_fail_percentage 41016 1727204186.98480: done checking for max_fail_percentage 41016 1727204186.98481: checking to see if all hosts have failed and the running result is not ok 41016 1727204186.98482: done checking to see if all hosts have failed 41016 1727204186.98482: getting the remaining hosts for this loop 41016 1727204186.98484: done getting the remaining hosts for this loop 41016 1727204186.98486: getting the next task for host managed-node1 41016 1727204186.98491: done getting next task for host managed-node1 41016 1727204186.98580: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 41016 1727204186.98583: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 41016 1727204186.98587: getting variables 41016 1727204186.98588: in VariableManager get_vars() 41016 1727204186.98618: Calling all_inventory to load vars for managed-node1 41016 1727204186.98620: Calling groups_inventory to load vars for managed-node1 41016 1727204186.98622: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204186.98629: Calling all_plugins_play to load vars for managed-node1 41016 1727204186.98631: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204186.98634: Calling groups_plugins_play to load vars for managed-node1 41016 1727204186.98956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204186.99357: done with get_vars() 41016 1727204186.99485: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.067) 0:00:10.671 ***** 41016 1727204186.99568: entering _queue_task() for managed-node1/include_tasks 41016 1727204187.00298: worker is 1 (out of 1 available) 41016 1727204187.00311: exiting _queue_task() for managed-node1/include_tasks 41016 1727204187.00322: done queuing things up, now waiting for results queue to drain 41016 1727204187.00324: waiting for pending results... 41016 1727204187.00684: running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' 41016 1727204187.01132: in run() - task 028d2410-947f-12d5-0ec4-000000000374 41016 1727204187.01143: variable 'ansible_search_path' from source: unknown 41016 1727204187.01147: variable 'ansible_search_path' from source: unknown 41016 1727204187.01180: calling self._execute() 41016 1727204187.01505: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204187.01523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204187.01532: variable 'omit' from source: magic vars 41016 1727204187.02645: variable 'ansible_distribution_major_version' from source: facts 41016 1727204187.02655: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204187.02663: _execute() done 41016 1727204187.02666: dumping result to json 41016 1727204187.02668: done dumping result, returning 41016 1727204187.02674: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_current_interfaces.yml' [028d2410-947f-12d5-0ec4-000000000374] 41016 1727204187.03102: sending task result for task 028d2410-947f-12d5-0ec4-000000000374 41016 1727204187.03188: done sending task result for task 028d2410-947f-12d5-0ec4-000000000374 41016 1727204187.03190: WORKER PROCESS EXITING 41016 1727204187.03251: no more pending results, returning what we have 41016 1727204187.03257: in VariableManager get_vars() 41016 1727204187.03314: Calling all_inventory to load vars for managed-node1 41016 1727204187.03317: Calling groups_inventory to load vars for managed-node1 41016 1727204187.03320: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204187.03335: Calling all_plugins_play to load vars for managed-node1 41016 1727204187.03339: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204187.03342: Calling groups_plugins_play to load vars for managed-node1 41016 1727204187.03647: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204187.04519: done with get_vars() 41016 1727204187.04528: variable 'ansible_search_path' from source: unknown 41016 1727204187.04529: variable 'ansible_search_path' from source: unknown 41016 1727204187.04820: we have included files to process 41016 1727204187.04822: generating all_blocks data 41016 1727204187.04823: done generating all_blocks data 41016 1727204187.04825: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41016 1727204187.04826: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41016 1727204187.04829: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 41016 1727204187.05410: done processing included file 41016 1727204187.05412: iterating over new_blocks loaded from include file 41016 1727204187.05414: in VariableManager get_vars() 41016 1727204187.05439: done with get_vars() 41016 1727204187.05441: filtering new block on tags 41016 1727204187.05645: done filtering new block on tags 41016 1727204187.05648: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node1 41016 1727204187.05653: extending task lists for all hosts with included blocks 41016 1727204187.06029: done extending task lists 41016 1727204187.06031: done processing included files 41016 1727204187.06032: results queue empty 41016 1727204187.06032: checking for any_errors_fatal 41016 1727204187.06035: done checking for any_errors_fatal 41016 1727204187.06036: checking for max_fail_percentage 41016 1727204187.06037: done checking for max_fail_percentage 41016 1727204187.06038: checking to see if all hosts have failed and the running result is not ok 41016 1727204187.06039: done checking to see if all hosts have failed 41016 1727204187.06039: getting the remaining hosts for this loop 41016 1727204187.06040: done getting the remaining hosts for this loop 41016 1727204187.06043: getting the next task for host managed-node1 41016 1727204187.06048: done getting next task for host managed-node1 41016 1727204187.06050: ^ task is: TASK: Gather current interface info 41016 1727204187.06054: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204187.06056: getting variables 41016 1727204187.06057: in VariableManager get_vars() 41016 1727204187.06185: Calling all_inventory to load vars for managed-node1 41016 1727204187.06188: Calling groups_inventory to load vars for managed-node1 41016 1727204187.06190: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204187.06196: Calling all_plugins_play to load vars for managed-node1 41016 1727204187.06198: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204187.06201: Calling groups_plugins_play to load vars for managed-node1 41016 1727204187.06581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204187.06929: done with get_vars() 41016 1727204187.07003: done getting variables 41016 1727204187.07150: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.076) 0:00:10.747 ***** 41016 1727204187.07187: entering _queue_task() for managed-node1/command 41016 1727204187.07512: worker is 1 (out of 1 available) 41016 1727204187.07525: exiting _queue_task() for managed-node1/command 41016 1727204187.07537: done queuing things up, now waiting for results queue to drain 41016 1727204187.07539: waiting for pending results... 
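[Editor's note] The include chain logged above descends from manage_test_interface.yml:13 into show_interfaces.yml, which at its line 3 pulls in get_current_interfaces.yml. A minimal sketch of those two include steps, assuming nothing beyond the file names and task names that appear in the log (the exact YAML bodies are an assumption, not the project's verbatim files):

    # manage_test_interface.yml, around line 13 (sketch)
    - name: Include the task 'show_interfaces.yml'
      include_tasks: show_interfaces.yml

    # show_interfaces.yml, around line 3 (sketch)
    - name: Include the task 'get_current_interfaces.yml'
      include_tasks: get_current_interfaces.yml
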
41016 1727204187.07712: running TaskExecutor() for managed-node1/TASK: Gather current interface info 41016 1727204187.07781: in run() - task 028d2410-947f-12d5-0ec4-0000000003ab 41016 1727204187.07794: variable 'ansible_search_path' from source: unknown 41016 1727204187.07797: variable 'ansible_search_path' from source: unknown 41016 1727204187.07827: calling self._execute() 41016 1727204187.07888: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204187.07892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204187.07900: variable 'omit' from source: magic vars 41016 1727204187.08167: variable 'ansible_distribution_major_version' from source: facts 41016 1727204187.08178: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204187.08184: variable 'omit' from source: magic vars 41016 1727204187.08221: variable 'omit' from source: magic vars 41016 1727204187.08245: variable 'omit' from source: magic vars 41016 1727204187.08281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204187.08307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204187.08323: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204187.08337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204187.08347: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204187.08372: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204187.08377: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204187.08380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204187.08448: Set connection var ansible_shell_executable to /bin/sh 41016 1727204187.08451: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204187.08457: Set connection var ansible_shell_type to sh 41016 1727204187.08462: Set connection var ansible_timeout to 10 41016 1727204187.08472: Set connection var ansible_pipelining to False 41016 1727204187.08474: Set connection var ansible_connection to ssh 41016 1727204187.08491: variable 'ansible_shell_executable' from source: unknown 41016 1727204187.08493: variable 'ansible_connection' from source: unknown 41016 1727204187.08496: variable 'ansible_module_compression' from source: unknown 41016 1727204187.08499: variable 'ansible_shell_type' from source: unknown 41016 1727204187.08501: variable 'ansible_shell_executable' from source: unknown 41016 1727204187.08503: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204187.08507: variable 'ansible_pipelining' from source: unknown 41016 1727204187.08512: variable 'ansible_timeout' from source: unknown 41016 1727204187.08515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204187.08615: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204187.08621: variable 'omit' from source: magic vars 41016 
1727204187.08628: starting attempt loop 41016 1727204187.08631: running the handler 41016 1727204187.08644: _low_level_execute_command(): starting 41016 1727204187.08651: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204187.09319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204187.09699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204187.09717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204187.11487: stdout chunk (state=3): >>>/root <<< 41016 1727204187.11595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204187.11617: stderr chunk (state=3): >>><<< 41016 1727204187.11620: stdout chunk (state=3): >>><<< 41016 1727204187.11640: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204187.11653: _low_level_execute_command(): starting 41016 1727204187.11662: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204187.1164083-42125-160015459130300 `" && echo ansible-tmp-1727204187.1164083-42125-160015459130300="` echo /root/.ansible/tmp/ansible-tmp-1727204187.1164083-42125-160015459130300 `" ) && sleep 0' 41016 1727204187.12182: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204187.12203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204187.12334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204187.14445: stdout chunk (state=3): >>>ansible-tmp-1727204187.1164083-42125-160015459130300=/root/.ansible/tmp/ansible-tmp-1727204187.1164083-42125-160015459130300 <<< 41016 1727204187.14594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204187.14625: stderr chunk (state=3): >>><<< 41016 1727204187.14633: stdout chunk (state=3): >>><<< 41016 1727204187.14656: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204187.1164083-42125-160015459130300=/root/.ansible/tmp/ansible-tmp-1727204187.1164083-42125-160015459130300 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204187.14694: variable 'ansible_module_compression' from source: unknown 41016 1727204187.14763: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204187.14805: variable 'ansible_facts' from source: unknown 41016 1727204187.14913: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204187.1164083-42125-160015459130300/AnsiballZ_command.py 41016 1727204187.15102: Sending initial data 41016 1727204187.15106: Sent initial data (156 bytes) 
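[Editor's note] Before transferring AnsiballZ_command.py over SFTP, the command action resolves its connection settings: ssh connection, sh shell with /bin/sh as the executable, a 10-second timeout, pipelining off, and ZIP_DEFLATED module compression. Purely as an illustration of what those logged "Set connection var" values mean, the same settings could be pinned explicitly as host variables; the variable names below are standard Ansible connection variables taken from the log, but the test itself is not shown setting them this way:

    # hypothetical host_vars entry mirroring the logged connection vars (sketch)
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_timeout: 10
    ansible_pipelining: false
    ansible_module_compression: ZIP_DEFLATED
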
41016 1727204187.15799: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204187.15859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204187.15879: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204187.15913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204187.16032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204187.17789: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204187.17889: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204187.17999: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpyejsheh4 /root/.ansible/tmp/ansible-tmp-1727204187.1164083-42125-160015459130300/AnsiballZ_command.py <<< 41016 1727204187.18015: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204187.1164083-42125-160015459130300/AnsiballZ_command.py" <<< 41016 1727204187.18075: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpyejsheh4" to remote "/root/.ansible/tmp/ansible-tmp-1727204187.1164083-42125-160015459130300/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204187.1164083-42125-160015459130300/AnsiballZ_command.py" <<< 41016 1727204187.19087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204187.19090: stderr chunk (state=3): >>><<< 41016 1727204187.19093: stdout chunk (state=3): >>><<< 41016 1727204187.19098: done transferring module to remote 41016 1727204187.19119: _low_level_execute_command(): starting 41016 1727204187.19129: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204187.1164083-42125-160015459130300/ /root/.ansible/tmp/ansible-tmp-1727204187.1164083-42125-160015459130300/AnsiballZ_command.py && sleep 0' 41016 1727204187.19800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204187.19818: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204187.19832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204187.19860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204187.19969: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204187.19997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204187.20115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204187.22142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204187.22146: stdout chunk (state=3): >>><<< 41016 1727204187.22150: stderr chunk (state=3): >>><<< 41016 1727204187.22172: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204187.22270: _low_level_execute_command(): starting 41016 1727204187.22274: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204187.1164083-42125-160015459130300/AnsiballZ_command.py && sleep 0' 41016 1727204187.22853: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204187.22868: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204187.22885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204187.22901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204187.22929: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204187.22993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204187.23046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204187.23063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204187.23086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204187.23202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204187.39989: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:56:27.394750", "end": "2024-09-24 14:56:27.398220", "delta": "0:00:00.003470", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204187.41753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204187.41757: stdout chunk (state=3): >>><<< 41016 1727204187.41759: stderr chunk (state=3): >>><<< 41016 1727204187.41778: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:56:27.394750", "end": "2024-09-24 14:56:27.398220", "delta": "0:00:00.003470", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
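[Editor's note] The module result just logged shows ansible.legacy.command running "ls -1" with chdir=/sys/class/net to list the interfaces under /sys/class/net. A plausible sketch of the task behind this invocation, assuming the _current_interfaces register name that appears later in the log and a changed_when: false to match the task's final "changed": false status (the raw module result itself reports changed: true):

    # get_current_interfaces.yml, around line 3 (sketch)
    - name: Gather current interface info
      command: ls -1
      args:
        chdir: /sys/class/net
      register: _current_interfaces
      changed_when: false
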
41016 1727204187.41829: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204187.1164083-42125-160015459130300/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204187.41864: _low_level_execute_command(): starting 41016 1727204187.41868: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204187.1164083-42125-160015459130300/ > /dev/null 2>&1 && sleep 0' 41016 1727204187.42379: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204187.42394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204187.42413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204187.42462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204187.42468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204187.42470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204187.42548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204187.44718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204187.44737: stderr chunk (state=3): >>><<< 41016 1727204187.44740: stdout chunk (state=3): >>><<< 41016 1727204187.44757: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204187.44764: handler run complete 41016 1727204187.44825: Evaluated conditional (False): False 41016 1727204187.44828: attempt loop complete, returning result 41016 1727204187.44831: _execute() done 41016 1727204187.44833: dumping result to json 41016 1727204187.44835: done dumping result, returning 41016 1727204187.44837: done running TaskExecutor() for managed-node1/TASK: Gather current interface info [028d2410-947f-12d5-0ec4-0000000003ab] 41016 1727204187.44839: sending task result for task 028d2410-947f-12d5-0ec4-0000000003ab 41016 1727204187.44919: done sending task result for task 028d2410-947f-12d5-0ec4-0000000003ab 41016 1727204187.44922: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003470", "end": "2024-09-24 14:56:27.398220", "rc": 0, "start": "2024-09-24 14:56:27.394750" } STDOUT: bonding_masters eth0 ethtest0 lo peerethtest0 41016 1727204187.44998: no more pending results, returning what we have 41016 1727204187.45002: results queue empty 41016 1727204187.45003: checking for any_errors_fatal 41016 1727204187.45004: done checking for any_errors_fatal 41016 1727204187.45005: checking for max_fail_percentage 41016 1727204187.45006: done checking for max_fail_percentage 41016 1727204187.45007: checking to see if all hosts have failed and the running result is not ok 41016 1727204187.45008: done checking to see if all hosts have failed 41016 1727204187.45011: getting the remaining hosts for this loop 41016 1727204187.45012: done getting the remaining hosts for this loop 41016 1727204187.45016: getting the next task for host managed-node1 41016 1727204187.45023: done getting next task for host managed-node1 41016 1727204187.45025: ^ task is: TASK: Set current_interfaces 41016 1727204187.45030: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204187.45146: getting variables 41016 1727204187.45148: in VariableManager get_vars() 41016 1727204187.45188: Calling all_inventory to load vars for managed-node1 41016 1727204187.45191: Calling groups_inventory to load vars for managed-node1 41016 1727204187.45193: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204187.45203: Calling all_plugins_play to load vars for managed-node1 41016 1727204187.45206: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204187.45211: Calling groups_plugins_play to load vars for managed-node1 41016 1727204187.45396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204187.45649: done with get_vars() 41016 1727204187.45660: done getting variables 41016 1727204187.45727: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.385) 0:00:11.133 ***** 41016 1727204187.45757: entering _queue_task() for managed-node1/set_fact 41016 1727204187.46161: worker is 1 (out of 1 available) 41016 1727204187.46173: exiting _queue_task() for managed-node1/set_fact 41016 1727204187.46185: done queuing things up, now waiting for results queue to drain 41016 1727204187.46186: waiting for pending results... 
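[Editor's note] The "Set current_interfaces" task queued here turns the registered command output into the current_interfaces fact whose value is printed a few records below. A minimal sketch, assuming the list is simply the stdout_lines of the registered result (the fact name and final value come from the log; the exact expression is an assumption):

    # get_current_interfaces.yml, around line 9 (sketch)
    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"
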
41016 1727204187.46491: running TaskExecutor() for managed-node1/TASK: Set current_interfaces 41016 1727204187.46496: in run() - task 028d2410-947f-12d5-0ec4-0000000003ac 41016 1727204187.46507: variable 'ansible_search_path' from source: unknown 41016 1727204187.46520: variable 'ansible_search_path' from source: unknown 41016 1727204187.46556: calling self._execute() 41016 1727204187.46652: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204187.46662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204187.46674: variable 'omit' from source: magic vars 41016 1727204187.47068: variable 'ansible_distribution_major_version' from source: facts 41016 1727204187.47086: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204187.47133: variable 'omit' from source: magic vars 41016 1727204187.47164: variable 'omit' from source: magic vars 41016 1727204187.47283: variable '_current_interfaces' from source: set_fact 41016 1727204187.47355: variable 'omit' from source: magic vars 41016 1727204187.47406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204187.47459: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204187.47478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204187.47568: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204187.47572: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204187.47574: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204187.47578: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204187.47580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204187.47681: Set connection var ansible_shell_executable to /bin/sh 41016 1727204187.47693: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204187.47705: Set connection var ansible_shell_type to sh 41016 1727204187.47723: Set connection var ansible_timeout to 10 41016 1727204187.47733: Set connection var ansible_pipelining to False 41016 1727204187.47744: Set connection var ansible_connection to ssh 41016 1727204187.47769: variable 'ansible_shell_executable' from source: unknown 41016 1727204187.47782: variable 'ansible_connection' from source: unknown 41016 1727204187.47790: variable 'ansible_module_compression' from source: unknown 41016 1727204187.47797: variable 'ansible_shell_type' from source: unknown 41016 1727204187.47880: variable 'ansible_shell_executable' from source: unknown 41016 1727204187.47883: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204187.47887: variable 'ansible_pipelining' from source: unknown 41016 1727204187.47889: variable 'ansible_timeout' from source: unknown 41016 1727204187.47891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204187.48010: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 41016 1727204187.48017: variable 'omit' from source: magic vars 41016 1727204187.48019: starting attempt loop 41016 1727204187.48021: running the handler 41016 1727204187.48023: handler run complete 41016 1727204187.48024: attempt loop complete, returning result 41016 1727204187.48026: _execute() done 41016 1727204187.48027: dumping result to json 41016 1727204187.48034: done dumping result, returning 41016 1727204187.48044: done running TaskExecutor() for managed-node1/TASK: Set current_interfaces [028d2410-947f-12d5-0ec4-0000000003ac] 41016 1727204187.48051: sending task result for task 028d2410-947f-12d5-0ec4-0000000003ac 41016 1727204187.48385: done sending task result for task 028d2410-947f-12d5-0ec4-0000000003ac 41016 1727204187.48389: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "ethtest0", "lo", "peerethtest0" ] }, "changed": false } 41016 1727204187.48448: no more pending results, returning what we have 41016 1727204187.48451: results queue empty 41016 1727204187.48452: checking for any_errors_fatal 41016 1727204187.48458: done checking for any_errors_fatal 41016 1727204187.48459: checking for max_fail_percentage 41016 1727204187.48461: done checking for max_fail_percentage 41016 1727204187.48461: checking to see if all hosts have failed and the running result is not ok 41016 1727204187.48462: done checking to see if all hosts have failed 41016 1727204187.48463: getting the remaining hosts for this loop 41016 1727204187.48465: done getting the remaining hosts for this loop 41016 1727204187.48469: getting the next task for host managed-node1 41016 1727204187.48480: done getting next task for host managed-node1 41016 1727204187.48483: ^ task is: TASK: Show current_interfaces 41016 1727204187.48487: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204187.48491: getting variables 41016 1727204187.48492: in VariableManager get_vars() 41016 1727204187.48535: Calling all_inventory to load vars for managed-node1 41016 1727204187.48538: Calling groups_inventory to load vars for managed-node1 41016 1727204187.48541: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204187.48551: Calling all_plugins_play to load vars for managed-node1 41016 1727204187.48554: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204187.48557: Calling groups_plugins_play to load vars for managed-node1 41016 1727204187.48842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204187.49060: done with get_vars() 41016 1727204187.49072: done getting variables 41016 1727204187.49143: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.034) 0:00:11.167 ***** 41016 1727204187.49177: entering _queue_task() for managed-node1/debug 41016 1727204187.49595: worker is 1 (out of 1 available) 41016 1727204187.49607: exiting _queue_task() for managed-node1/debug 41016 1727204187.49620: done queuing things up, now waiting for results queue to drain 41016 1727204187.49621: waiting for pending results... 41016 1727204187.49808: running TaskExecutor() for managed-node1/TASK: Show current_interfaces 41016 1727204187.49934: in run() - task 028d2410-947f-12d5-0ec4-000000000375 41016 1727204187.49962: variable 'ansible_search_path' from source: unknown 41016 1727204187.49970: variable 'ansible_search_path' from source: unknown 41016 1727204187.50021: calling self._execute() 41016 1727204187.50127: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204187.50138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204187.50173: variable 'omit' from source: magic vars 41016 1727204187.50563: variable 'ansible_distribution_major_version' from source: facts 41016 1727204187.50584: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204187.50597: variable 'omit' from source: magic vars 41016 1727204187.50680: variable 'omit' from source: magic vars 41016 1727204187.50765: variable 'current_interfaces' from source: set_fact 41016 1727204187.50798: variable 'omit' from source: magic vars 41016 1727204187.50845: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204187.50966: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204187.50969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204187.50971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204187.50973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204187.50976: 
variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204187.50985: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204187.50991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204187.51093: Set connection var ansible_shell_executable to /bin/sh 41016 1727204187.51103: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204187.51115: Set connection var ansible_shell_type to sh 41016 1727204187.51123: Set connection var ansible_timeout to 10 41016 1727204187.51130: Set connection var ansible_pipelining to False 41016 1727204187.51139: Set connection var ansible_connection to ssh 41016 1727204187.51161: variable 'ansible_shell_executable' from source: unknown 41016 1727204187.51168: variable 'ansible_connection' from source: unknown 41016 1727204187.51173: variable 'ansible_module_compression' from source: unknown 41016 1727204187.51184: variable 'ansible_shell_type' from source: unknown 41016 1727204187.51194: variable 'ansible_shell_executable' from source: unknown 41016 1727204187.51280: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204187.51283: variable 'ansible_pipelining' from source: unknown 41016 1727204187.51289: variable 'ansible_timeout' from source: unknown 41016 1727204187.51291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204187.51362: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204187.51380: variable 'omit' from source: magic vars 41016 1727204187.51396: starting attempt loop 41016 1727204187.51405: running the handler 41016 1727204187.51463: handler run complete 41016 1727204187.51485: attempt loop complete, returning result 41016 1727204187.51492: _execute() done 41016 1727204187.51498: dumping result to json 41016 1727204187.51515: done dumping result, returning 41016 1727204187.51531: done running TaskExecutor() for managed-node1/TASK: Show current_interfaces [028d2410-947f-12d5-0ec4-000000000375] 41016 1727204187.51540: sending task result for task 028d2410-947f-12d5-0ec4-000000000375 41016 1727204187.51750: done sending task result for task 028d2410-947f-12d5-0ec4-000000000375 41016 1727204187.51753: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'ethtest0', 'lo', 'peerethtest0'] 41016 1727204187.51805: no more pending results, returning what we have 41016 1727204187.51812: results queue empty 41016 1727204187.51813: checking for any_errors_fatal 41016 1727204187.51819: done checking for any_errors_fatal 41016 1727204187.51820: checking for max_fail_percentage 41016 1727204187.51821: done checking for max_fail_percentage 41016 1727204187.51823: checking to see if all hosts have failed and the running result is not ok 41016 1727204187.51824: done checking to see if all hosts have failed 41016 1727204187.51825: getting the remaining hosts for this loop 41016 1727204187.51826: done getting the remaining hosts for this loop 41016 1727204187.51834: getting the next task for host managed-node1 41016 1727204187.51845: done getting next task for host managed-node1 41016 1727204187.51848: ^ task is: TASK: Install iproute 41016 1727204187.51851: ^ state is: HOST STATE: 
block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204187.51856: getting variables 41016 1727204187.51858: in VariableManager get_vars() 41016 1727204187.51905: Calling all_inventory to load vars for managed-node1 41016 1727204187.51911: Calling groups_inventory to load vars for managed-node1 41016 1727204187.51915: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204187.51925: Calling all_plugins_play to load vars for managed-node1 41016 1727204187.51929: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204187.51932: Calling groups_plugins_play to load vars for managed-node1 41016 1727204187.52396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204187.52596: done with get_vars() 41016 1727204187.52607: done getting variables 41016 1727204187.52665: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.035) 0:00:11.203 ***** 41016 1727204187.52699: entering _queue_task() for managed-node1/package 41016 1727204187.52997: worker is 1 (out of 1 available) 41016 1727204187.53012: exiting _queue_task() for managed-node1/package 41016 1727204187.53136: done queuing things up, now waiting for results queue to drain 41016 1727204187.53138: waiting for pending results... 
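[Editor's note] Two further steps are visible here: the "Show current_interfaces" debug at show_interfaces.yml:5, whose output message appears above, and the "Install iproute" package task queued from manage_test_interface.yml:16. A hedged sketch of both, assuming a plain debug message matching the logged output and a straightforward package install; since the log also consults __network_is_ostree while preparing the package task, the real task may additionally select an ostree-aware package backend:

    # show_interfaces.yml, around line 5 (sketch)
    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"

    # manage_test_interface.yml, around line 16 (sketch)
    - name: Install iproute
      package:
        name: iproute
        state: present
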
41016 1727204187.53364: running TaskExecutor() for managed-node1/TASK: Install iproute 41016 1727204187.53461: in run() - task 028d2410-947f-12d5-0ec4-0000000002ff 41016 1727204187.53466: variable 'ansible_search_path' from source: unknown 41016 1727204187.53469: variable 'ansible_search_path' from source: unknown 41016 1727204187.53492: calling self._execute() 41016 1727204187.53595: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204187.53607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204187.53624: variable 'omit' from source: magic vars 41016 1727204187.54081: variable 'ansible_distribution_major_version' from source: facts 41016 1727204187.54084: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204187.54087: variable 'omit' from source: magic vars 41016 1727204187.54099: variable 'omit' from source: magic vars 41016 1727204187.54316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204187.56538: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204187.56629: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204187.56668: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204187.56879: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204187.56884: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204187.56887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204187.56890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204187.56907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204187.56954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204187.56971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204187.57086: variable '__network_is_ostree' from source: set_fact 41016 1727204187.57096: variable 'omit' from source: magic vars 41016 1727204187.57139: variable 'omit' from source: magic vars 41016 1727204187.57169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204187.57201: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204187.57232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204187.57252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 41016 1727204187.57266: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204187.57299: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204187.57307: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204187.57317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204187.57423: Set connection var ansible_shell_executable to /bin/sh 41016 1727204187.57433: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204187.57451: Set connection var ansible_shell_type to sh 41016 1727204187.57461: Set connection var ansible_timeout to 10 41016 1727204187.57469: Set connection var ansible_pipelining to False 41016 1727204187.57482: Set connection var ansible_connection to ssh 41016 1727204187.57552: variable 'ansible_shell_executable' from source: unknown 41016 1727204187.57554: variable 'ansible_connection' from source: unknown 41016 1727204187.57557: variable 'ansible_module_compression' from source: unknown 41016 1727204187.57559: variable 'ansible_shell_type' from source: unknown 41016 1727204187.57561: variable 'ansible_shell_executable' from source: unknown 41016 1727204187.57563: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204187.57564: variable 'ansible_pipelining' from source: unknown 41016 1727204187.57566: variable 'ansible_timeout' from source: unknown 41016 1727204187.57568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204187.57663: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204187.57681: variable 'omit' from source: magic vars 41016 1727204187.57691: starting attempt loop 41016 1727204187.57697: running the handler 41016 1727204187.57707: variable 'ansible_facts' from source: unknown 41016 1727204187.57770: variable 'ansible_facts' from source: unknown 41016 1727204187.57772: _low_level_execute_command(): starting 41016 1727204187.57775: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204187.58541: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204187.58607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 
1727204187.58637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204187.58704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204187.58823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204187.60603: stdout chunk (state=3): >>>/root <<< 41016 1727204187.60763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204187.60766: stdout chunk (state=3): >>><<< 41016 1727204187.60769: stderr chunk (state=3): >>><<< 41016 1727204187.60790: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204187.60895: _low_level_execute_command(): starting 41016 1727204187.60900: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204187.6080205-42143-222217286708924 `" && echo ansible-tmp-1727204187.6080205-42143-222217286708924="` echo /root/.ansible/tmp/ansible-tmp-1727204187.6080205-42143-222217286708924 `" ) && sleep 0' 41016 1727204187.61485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204187.61502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204187.61528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204187.61595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204187.61661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204187.61685: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204187.61711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204187.61821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204187.63928: stdout chunk (state=3): >>>ansible-tmp-1727204187.6080205-42143-222217286708924=/root/.ansible/tmp/ansible-tmp-1727204187.6080205-42143-222217286708924 <<< 41016 1727204187.64082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204187.64085: stdout chunk (state=3): >>><<< 41016 1727204187.64101: stderr chunk (state=3): >>><<< 41016 1727204187.64281: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204187.6080205-42143-222217286708924=/root/.ansible/tmp/ansible-tmp-1727204187.6080205-42143-222217286708924 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204187.64284: variable 'ansible_module_compression' from source: unknown 41016 1727204187.64287: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 41016 1727204187.64289: variable 'ansible_facts' from source: unknown 41016 1727204187.64408: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204187.6080205-42143-222217286708924/AnsiballZ_dnf.py 41016 1727204187.64644: Sending initial data 41016 1727204187.64646: Sent initial data (152 bytes) 41016 1727204187.65205: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204187.65222: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204187.65237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204187.65256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204187.65290: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204187.65387: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204187.65410: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204187.65429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204187.65547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204187.67272: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204187.67377: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204187.67452: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpgw8xswn6 /root/.ansible/tmp/ansible-tmp-1727204187.6080205-42143-222217286708924/AnsiballZ_dnf.py <<< 41016 1727204187.67455: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204187.6080205-42143-222217286708924/AnsiballZ_dnf.py" <<< 41016 1727204187.67521: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpgw8xswn6" to remote "/root/.ansible/tmp/ansible-tmp-1727204187.6080205-42143-222217286708924/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204187.6080205-42143-222217286708924/AnsiballZ_dnf.py" <<< 41016 1727204187.68749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204187.68792: stderr chunk (state=3): >>><<< 41016 1727204187.68885: stdout chunk (state=3): >>><<< 41016 1727204187.68888: done transferring module to remote 41016 1727204187.68891: _low_level_execute_command(): starting 41016 1727204187.68893: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204187.6080205-42143-222217286708924/ /root/.ansible/tmp/ansible-tmp-1727204187.6080205-42143-222217286708924/AnsiballZ_dnf.py && sleep 0' 41016 1727204187.69894: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204187.69898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204187.70010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204187.72082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204187.72086: stderr chunk (state=3): >>><<< 41016 1727204187.72088: stdout chunk (state=3): >>><<< 41016 1727204187.72091: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204187.72093: _low_level_execute_command(): starting 41016 1727204187.72096: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204187.6080205-42143-222217286708924/AnsiballZ_dnf.py && sleep 0' 41016 1727204187.72795: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41016 1727204187.72878: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 
1727204187.72904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204187.73022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204188.18855: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 41016 1727204188.24066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204188.24105: stderr chunk (state=3): >>><<< 41016 1727204188.24108: stdout chunk (state=3): >>><<< 41016 1727204188.24127: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
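The _low_level_execute_command() calls traced above follow the usual remote-execution sequence for an AnsiballZ module: resolve the remote home, create a per-task temporary directory, transfer AnsiballZ_dnf.py over the multiplexed SSH connection via sftp, mark it executable, run it with the remote Python, and finally (next step in the log) remove the directory. A condensed sketch of that sequence, with ansible-tmp-EXAMPLE standing in for the timestamped directory name in the log and the sftp upload reduced to a comment:

    TMPDIR=/root/.ansible/tmp/ansible-tmp-EXAMPLE          # placeholder for ansible-tmp-<epoch>-<pid>-<random>
    /bin/sh -c 'echo ~ && sleep 0'                          # resolve the remote home directory
    /bin/sh -c "( umask 77 && mkdir -p /root/.ansible/tmp && mkdir $TMPDIR ) && sleep 0"
    # AnsiballZ_dnf.py is uploaded into $TMPDIR here via sftp over the existing SSH connection
    /bin/sh -c "chmod u+x $TMPDIR/ $TMPDIR/AnsiballZ_dnf.py && sleep 0"
    /bin/sh -c "/usr/bin/python3.12 $TMPDIR/AnsiballZ_dnf.py && sleep 0"   # emits the JSON result shown above
    /bin/sh -c "rm -f -r $TMPDIR/ > /dev/null 2>&1 && sleep 0"            # cleanup, as in the next log step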
41016 1727204188.24221: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204187.6080205-42143-222217286708924/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204188.24229: _low_level_execute_command(): starting 41016 1727204188.24232: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204187.6080205-42143-222217286708924/ > /dev/null 2>&1 && sleep 0' 41016 1727204188.24816: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204188.24820: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204188.24822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204188.24824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204188.24925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204188.24928: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204188.24930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.24932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204188.24934: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204188.24936: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41016 1727204188.24938: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204188.24939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204188.24941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204188.24972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204188.24990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204188.25002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204188.25113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204188.27123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204188.27157: stderr chunk (state=3): >>><<< 41016 1727204188.27160: stdout chunk (state=3): >>><<< 41016 1727204188.27180: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204188.27381: handler run complete 41016 1727204188.27384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204188.27545: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204188.27616: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204188.27652: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204188.27691: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204188.27779: variable '__install_status' from source: set_fact 41016 1727204188.27803: Evaluated conditional (__install_status is success): True 41016 1727204188.27830: attempt loop complete, returning result 41016 1727204188.27838: _execute() done 41016 1727204188.27845: dumping result to json 41016 1727204188.27854: done dumping result, returning 41016 1727204188.27864: done running TaskExecutor() for managed-node1/TASK: Install iproute [028d2410-947f-12d5-0ec4-0000000002ff] 41016 1727204188.27880: sending task result for task 028d2410-947f-12d5-0ec4-0000000002ff ok: [managed-node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 41016 1727204188.28369: no more pending results, returning what we have 41016 1727204188.28373: results queue empty 41016 1727204188.28377: checking for any_errors_fatal 41016 1727204188.28385: done checking for any_errors_fatal 41016 1727204188.28386: checking for max_fail_percentage 41016 1727204188.28389: done checking for max_fail_percentage 41016 1727204188.28390: checking to see if all hosts have failed and the running result is not ok 41016 1727204188.28390: done checking to see if all hosts have failed 41016 1727204188.28391: getting the remaining hosts for this loop 41016 1727204188.28393: done getting the remaining hosts for this loop 41016 1727204188.28397: getting the next task for host managed-node1 41016 1727204188.28405: done getting next task for host managed-node1 41016 1727204188.28408: ^ task is: TASK: Create veth interface {{ interface }} 41016 1727204188.28411: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204188.28414: getting variables 41016 1727204188.28416: in VariableManager get_vars() 41016 1727204188.28902: Calling all_inventory to load vars for managed-node1 41016 1727204188.28904: Calling groups_inventory to load vars for managed-node1 41016 1727204188.28906: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204188.28916: Calling all_plugins_play to load vars for managed-node1 41016 1727204188.28918: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204188.28921: Calling groups_plugins_play to load vars for managed-node1 41016 1727204188.29082: done sending task result for task 028d2410-947f-12d5-0ec4-0000000002ff 41016 1727204188.29086: WORKER PROCESS EXITING 41016 1727204188.29103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204188.29304: done with get_vars() 41016 1727204188.29316: done getting variables 41016 1727204188.29373: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204188.29493: variable 'interface' from source: set_fact TASK [Create veth interface ethtest1] ****************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 14:56:28 -0400 (0:00:00.768) 0:00:11.971 ***** 41016 1727204188.29527: entering _queue_task() for managed-node1/command 41016 1727204188.29986: worker is 1 (out of 1 available) 41016 1727204188.29995: exiting _queue_task() for managed-node1/command 41016 1727204188.30006: done queuing things up, now waiting for results queue to drain 41016 1727204188.30007: waiting for pending results... 
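The "Nothing to do" / changed: false result above simply means dnf found iproute already installed on managed-node1, so the task converged on its first attempt. A manual spot-check of the same state on the node (hedged: this is not what the dnf module runs internally, just an equivalent way to confirm it):

    # Installed already? Only fall back to installing when the query fails.
    rpm -q iproute || dnf -y install iproute
    # The ip utility that the just-queued "Create veth interface ethtest1" task depends on ships in this package.
    command -v ip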
41016 1727204188.30074: running TaskExecutor() for managed-node1/TASK: Create veth interface ethtest1 41016 1727204188.30193: in run() - task 028d2410-947f-12d5-0ec4-000000000300 41016 1727204188.30217: variable 'ansible_search_path' from source: unknown 41016 1727204188.30226: variable 'ansible_search_path' from source: unknown 41016 1727204188.30507: variable 'interface' from source: set_fact 41016 1727204188.30595: variable 'interface' from source: set_fact 41016 1727204188.30679: variable 'interface' from source: set_fact 41016 1727204188.30825: Loaded config def from plugin (lookup/items) 41016 1727204188.30838: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 41016 1727204188.30862: variable 'omit' from source: magic vars 41016 1727204188.30968: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204188.30986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204188.31181: variable 'omit' from source: magic vars 41016 1727204188.31299: variable 'ansible_distribution_major_version' from source: facts 41016 1727204188.31310: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204188.31492: variable 'type' from source: set_fact 41016 1727204188.31502: variable 'state' from source: include params 41016 1727204188.31515: variable 'interface' from source: set_fact 41016 1727204188.31524: variable 'current_interfaces' from source: set_fact 41016 1727204188.31535: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 41016 1727204188.31544: variable 'omit' from source: magic vars 41016 1727204188.31582: variable 'omit' from source: magic vars 41016 1727204188.31637: variable 'item' from source: unknown 41016 1727204188.31707: variable 'item' from source: unknown 41016 1727204188.31731: variable 'omit' from source: magic vars 41016 1727204188.31763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204188.31800: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204188.31822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204188.31848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204188.31861: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204188.31892: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204188.31899: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204188.31906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204188.32015: Set connection var ansible_shell_executable to /bin/sh 41016 1727204188.32030: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204188.32045: Set connection var ansible_shell_type to sh 41016 1727204188.32058: Set connection var ansible_timeout to 10 41016 1727204188.32069: Set connection var ansible_pipelining to False 41016 1727204188.32162: Set connection var ansible_connection to ssh 41016 1727204188.32165: variable 'ansible_shell_executable' from source: unknown 41016 1727204188.32167: variable 'ansible_connection' from source: unknown 41016 1727204188.32169: variable 
'ansible_module_compression' from source: unknown 41016 1727204188.32170: variable 'ansible_shell_type' from source: unknown 41016 1727204188.32172: variable 'ansible_shell_executable' from source: unknown 41016 1727204188.32173: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204188.32176: variable 'ansible_pipelining' from source: unknown 41016 1727204188.32178: variable 'ansible_timeout' from source: unknown 41016 1727204188.32180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204188.32272: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204188.32291: variable 'omit' from source: magic vars 41016 1727204188.32300: starting attempt loop 41016 1727204188.32307: running the handler 41016 1727204188.32323: _low_level_execute_command(): starting 41016 1727204188.32333: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204188.33565: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204188.33730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204188.33844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204188.35629: stdout chunk (state=3): >>>/root <<< 41016 1727204188.35765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204188.35769: stdout chunk (state=3): >>><<< 41016 1727204188.35780: stderr chunk (state=3): >>><<< 41016 1727204188.35846: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204188.35860: _low_level_execute_command(): starting 41016 1727204188.35867: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204188.3584597-42292-50820248534468 `" && echo ansible-tmp-1727204188.3584597-42292-50820248534468="` echo /root/.ansible/tmp/ansible-tmp-1727204188.3584597-42292-50820248534468 `" ) && sleep 0' 41016 1727204188.37043: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204188.37048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204188.37051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204188.37054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204188.37056: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204188.37058: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204188.37061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.37063: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204188.37065: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204188.37067: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41016 1727204188.37069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204188.37071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204188.37073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204188.37078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204188.37080: stderr chunk (state=3): >>>debug2: match found <<< 41016 1727204188.37088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.37158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204188.37194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204188.37292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204188.39429: stdout chunk (state=3): >>>ansible-tmp-1727204188.3584597-42292-50820248534468=/root/.ansible/tmp/ansible-tmp-1727204188.3584597-42292-50820248534468 <<< 41016 1727204188.39684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204188.39690: stdout chunk (state=3): >>><<< 41016 1727204188.39692: stderr chunk (state=3): >>><<< 41016 1727204188.39694: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204188.3584597-42292-50820248534468=/root/.ansible/tmp/ansible-tmp-1727204188.3584597-42292-50820248534468 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204188.39696: variable 'ansible_module_compression' from source: unknown 41016 1727204188.39726: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204188.39761: variable 'ansible_facts' from source: unknown 41016 1727204188.39907: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204188.3584597-42292-50820248534468/AnsiballZ_command.py 41016 1727204188.40247: Sending initial data 41016 1727204188.40251: Sent initial data (155 bytes) 41016 1727204188.40718: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.40779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204188.40797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204188.40873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204188.42686: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 
debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204188.42756: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204188.42858: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpj_yc_hly /root/.ansible/tmp/ansible-tmp-1727204188.3584597-42292-50820248534468/AnsiballZ_command.py <<< 41016 1727204188.42862: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204188.3584597-42292-50820248534468/AnsiballZ_command.py" <<< 41016 1727204188.42925: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpj_yc_hly" to remote "/root/.ansible/tmp/ansible-tmp-1727204188.3584597-42292-50820248534468/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204188.3584597-42292-50820248534468/AnsiballZ_command.py" <<< 41016 1727204188.43794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204188.43884: stderr chunk (state=3): >>><<< 41016 1727204188.43888: stdout chunk (state=3): >>><<< 41016 1727204188.43890: done transferring module to remote 41016 1727204188.43892: _low_level_execute_command(): starting 41016 1727204188.43894: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204188.3584597-42292-50820248534468/ /root/.ansible/tmp/ansible-tmp-1727204188.3584597-42292-50820248534468/AnsiballZ_command.py && sleep 0' 41016 1727204188.44251: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204188.44263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.44275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.44320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204188.44334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204188.44419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204188.46476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204188.46481: stdout 
chunk (state=3): >>><<< 41016 1727204188.46483: stderr chunk (state=3): >>><<< 41016 1727204188.46494: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204188.46497: _low_level_execute_command(): starting 41016 1727204188.46504: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204188.3584597-42292-50820248534468/AnsiballZ_command.py && sleep 0' 41016 1727204188.46974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204188.46979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.46982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204188.46984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 41016 1727204188.46986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.47035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204188.47040: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204188.47127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204188.64543: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1"], "start": "2024-09-24 14:56:28.634971", "end": "2024-09-24 14:56:28.641720", "delta": "0:00:00.006749", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest1 type veth peer name 
peerethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204188.68068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204188.68072: stdout chunk (state=3): >>><<< 41016 1727204188.68076: stderr chunk (state=3): >>><<< 41016 1727204188.68096: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1"], "start": "2024-09-24 14:56:28.634971", "end": "2024-09-24 14:56:28.641720", "delta": "0:00:00.006749", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest1 type veth peer name peerethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
41016 1727204188.68126: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest1 type veth peer name peerethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204188.3584597-42292-50820248534468/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204188.68134: _low_level_execute_command(): starting 41016 1727204188.68137: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204188.3584597-42292-50820248534468/ > /dev/null 2>&1 && sleep 0' 41016 1727204188.68567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204188.68571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204188.68601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.68604: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 41016 1727204188.68606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204188.68611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.68666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204188.68669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204188.68675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204188.68761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204188.73255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204188.73259: stdout chunk (state=3): >>><<< 41016 1727204188.73262: stderr chunk (state=3): >>><<< 41016 1727204188.73280: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204188.73286: handler run complete 41016 1727204188.73303: Evaluated conditional (False): False 41016 1727204188.73313: attempt loop complete, returning result 41016 1727204188.73328: variable 'item' from source: unknown 41016 1727204188.73396: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link add ethtest1 type veth peer name peerethtest1) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1" ], "delta": "0:00:00.006749", "end": "2024-09-24 14:56:28.641720", "item": "ip link add ethtest1 type veth peer name peerethtest1", "rc": 0, "start": "2024-09-24 14:56:28.634971" } 41016 1727204188.73569: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204188.73573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204188.73587: variable 'omit' from source: magic vars 41016 1727204188.73642: variable 'ansible_distribution_major_version' from source: facts 41016 1727204188.73645: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204188.73768: variable 'type' from source: set_fact 41016 1727204188.73772: variable 'state' from source: include params 41016 1727204188.73775: variable 'interface' from source: set_fact 41016 1727204188.73779: variable 'current_interfaces' from source: set_fact 41016 1727204188.73785: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 41016 1727204188.73789: variable 'omit' from source: magic vars 41016 1727204188.73805: variable 'omit' from source: magic vars 41016 1727204188.73832: variable 'item' from source: unknown 41016 1727204188.73874: variable 'item' from source: unknown 41016 1727204188.73887: variable 'omit' from source: magic vars 41016 1727204188.73914: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204188.73918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204188.73920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204188.73930: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204188.73932: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204188.73935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204188.73984: Set connection var ansible_shell_executable to /bin/sh 41016 1727204188.73987: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204188.73993: Set connection var ansible_shell_type to sh 41016 
1727204188.73998: Set connection var ansible_timeout to 10 41016 1727204188.74002: Set connection var ansible_pipelining to False 41016 1727204188.74015: Set connection var ansible_connection to ssh 41016 1727204188.74027: variable 'ansible_shell_executable' from source: unknown 41016 1727204188.74030: variable 'ansible_connection' from source: unknown 41016 1727204188.74032: variable 'ansible_module_compression' from source: unknown 41016 1727204188.74034: variable 'ansible_shell_type' from source: unknown 41016 1727204188.74036: variable 'ansible_shell_executable' from source: unknown 41016 1727204188.74038: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204188.74043: variable 'ansible_pipelining' from source: unknown 41016 1727204188.74045: variable 'ansible_timeout' from source: unknown 41016 1727204188.74049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204188.74115: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204188.74127: variable 'omit' from source: magic vars 41016 1727204188.74134: starting attempt loop 41016 1727204188.74137: running the handler 41016 1727204188.74139: _low_level_execute_command(): starting 41016 1727204188.74141: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204188.74605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204188.74613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204188.74615: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.74618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204188.74620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.74673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204188.74677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204188.74687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204188.74762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204188.76551: stdout chunk (state=3): >>>/root <<< 41016 1727204188.76654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204188.76685: stderr chunk (state=3): >>><<< 41016 1727204188.76688: stdout chunk (state=3): >>><<< 41016 1727204188.76702: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204188.76712: _low_level_execute_command(): starting 41016 1727204188.76718: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204188.7670283-42292-34018649675953 `" && echo ansible-tmp-1727204188.7670283-42292-34018649675953="` echo /root/.ansible/tmp/ansible-tmp-1727204188.7670283-42292-34018649675953 `" ) && sleep 0' 41016 1727204188.77143: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204188.77151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204188.77184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.77187: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204188.77190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.77238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204188.77242: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204188.77247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204188.77327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204188.79434: stdout chunk (state=3): >>>ansible-tmp-1727204188.7670283-42292-34018649675953=/root/.ansible/tmp/ansible-tmp-1727204188.7670283-42292-34018649675953 <<< 41016 1727204188.79546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204188.79573: stderr chunk (state=3): 
>>><<< 41016 1727204188.79578: stdout chunk (state=3): >>><<< 41016 1727204188.79591: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204188.7670283-42292-34018649675953=/root/.ansible/tmp/ansible-tmp-1727204188.7670283-42292-34018649675953 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204188.79614: variable 'ansible_module_compression' from source: unknown 41016 1727204188.79646: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204188.79664: variable 'ansible_facts' from source: unknown 41016 1727204188.79708: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204188.7670283-42292-34018649675953/AnsiballZ_command.py 41016 1727204188.79801: Sending initial data 41016 1727204188.79804: Sent initial data (155 bytes) 41016 1727204188.80237: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204188.80245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204188.80267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.80270: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204188.80280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.80325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204188.80336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204188.80424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 
1727204188.82148: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41016 1727204188.82154: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204188.82227: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204188.82306: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmp0hyy5kqw /root/.ansible/tmp/ansible-tmp-1727204188.7670283-42292-34018649675953/AnsiballZ_command.py <<< 41016 1727204188.82311: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204188.7670283-42292-34018649675953/AnsiballZ_command.py" <<< 41016 1727204188.82378: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmp0hyy5kqw" to remote "/root/.ansible/tmp/ansible-tmp-1727204188.7670283-42292-34018649675953/AnsiballZ_command.py" <<< 41016 1727204188.82382: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204188.7670283-42292-34018649675953/AnsiballZ_command.py" <<< 41016 1727204188.83072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204188.83088: stderr chunk (state=3): >>><<< 41016 1727204188.83091: stdout chunk (state=3): >>><<< 41016 1727204188.83136: done transferring module to remote 41016 1727204188.83139: _low_level_execute_command(): starting 41016 1727204188.83141: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204188.7670283-42292-34018649675953/ /root/.ansible/tmp/ansible-tmp-1727204188.7670283-42292-34018649675953/AnsiballZ_command.py && sleep 0' 41016 1727204188.83664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204188.83668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.83670: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204188.83672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
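The chunks above trace Ansible's per-command remote lifecycle: a per-task temp directory is created under /root/.ansible/tmp, the AnsiballZ_command.py payload is uploaded via sftp over the existing multiplexed SSH connection, made executable, run with /usr/bin/python3.12, and finally removed. A condensed sketch of that same sequence as plain shell, with TMPDIR standing in for the timestamped ansible-tmp-* directory shown in the log:

    # TMPDIR is a placeholder for the ansible-tmp-<timestamp>-<pid>-<rand> directory
    TMPDIR=/root/.ansible/tmp/ansible-tmp-example
    ( umask 77 && mkdir -p "$TMPDIR" )                   # create the remote temp dir
    # (AnsiballZ_command.py is copied into $TMPDIR via sftp at this point)
    chmod u+x "$TMPDIR" "$TMPDIR/AnsiballZ_command.py"   # allow execution
    /usr/bin/python3.12 "$TMPDIR/AnsiballZ_command.py"   # run the wrapped command module
    rm -f -r "$TMPDIR"                                   # clean up afterwards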
41016 1727204188.83720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204188.83723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204188.83816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204188.85779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204188.85800: stderr chunk (state=3): >>><<< 41016 1727204188.85803: stdout chunk (state=3): >>><<< 41016 1727204188.85819: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204188.85824: _low_level_execute_command(): starting 41016 1727204188.85829: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204188.7670283-42292-34018649675953/AnsiballZ_command.py && sleep 0' 41016 1727204188.86455: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204188.86467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 41016 1727204188.86571: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204188.86594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204188.86611: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204188.86628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204188.86653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
41016 1727204188.86791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204189.03741: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest1", "up"], "start": "2024-09-24 14:56:29.031464", "end": "2024-09-24 14:56:29.035618", "delta": "0:00:00.004154", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204189.05884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204189.05888: stdout chunk (state=3): >>><<< 41016 1727204189.05891: stderr chunk (state=3): >>><<< 41016 1727204189.05894: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest1", "up"], "start": "2024-09-24 14:56:29.031464", "end": "2024-09-24 14:56:29.035618", "delta": "0:00:00.004154", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
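The second loop item, `ip link set peerethtest1 up`, has now returned rc=0 as well. A small sketch, again assuming a root shell on the managed node, of how the peer's state could be checked after this step (the checks are illustrative and not part of the playbook):

    ip link set peerethtest1 up                 # the command the module just ran
    cat /sys/class/net/peerethtest1/operstate   # may still read "lowerlayerdown" until ethtest1 is also up
    ip -br link show peerethtest1               # brief one-line state summary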
41016 1727204189.05896: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest1 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204188.7670283-42292-34018649675953/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204189.05898: _low_level_execute_command(): starting 41016 1727204189.05900: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204188.7670283-42292-34018649675953/ > /dev/null 2>&1 && sleep 0' 41016 1727204189.06567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204189.06573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204189.06586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204189.06599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204189.06609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204189.06619: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204189.06628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204189.06648: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204189.06692: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204189.06745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204189.06782: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204189.06786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204189.06873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204189.09025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204189.09029: stdout chunk (state=3): >>><<< 41016 1727204189.09030: stderr chunk (state=3): >>><<< 41016 1727204189.09032: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204189.09034: handler run complete 41016 1727204189.09133: Evaluated conditional (False): False 41016 1727204189.09135: attempt loop complete, returning result 41016 1727204189.09149: variable 'item' from source: unknown 41016 1727204189.09328: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link set peerethtest1 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest1", "up" ], "delta": "0:00:00.004154", "end": "2024-09-24 14:56:29.035618", "item": "ip link set peerethtest1 up", "rc": 0, "start": "2024-09-24 14:56:29.031464" } 41016 1727204189.09766: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204189.09770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204189.09772: variable 'omit' from source: magic vars 41016 1727204189.10181: variable 'ansible_distribution_major_version' from source: facts 41016 1727204189.10185: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204189.10553: variable 'type' from source: set_fact 41016 1727204189.10563: variable 'state' from source: include params 41016 1727204189.10572: variable 'interface' from source: set_fact 41016 1727204189.10583: variable 'current_interfaces' from source: set_fact 41016 1727204189.10856: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 41016 1727204189.10860: variable 'omit' from source: magic vars 41016 1727204189.10862: variable 'omit' from source: magic vars 41016 1727204189.10864: variable 'item' from source: unknown 41016 1727204189.10914: variable 'item' from source: unknown 41016 1727204189.10936: variable 'omit' from source: magic vars 41016 1727204189.10991: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204189.11089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204189.11100: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204189.11122: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204189.11129: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204189.11136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204189.11333: Set connection var ansible_shell_executable to /bin/sh 41016 1727204189.11343: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204189.11352: Set 
connection var ansible_shell_type to sh 41016 1727204189.11362: Set connection var ansible_timeout to 10 41016 1727204189.11415: Set connection var ansible_pipelining to False 41016 1727204189.11427: Set connection var ansible_connection to ssh 41016 1727204189.11450: variable 'ansible_shell_executable' from source: unknown 41016 1727204189.11619: variable 'ansible_connection' from source: unknown 41016 1727204189.11622: variable 'ansible_module_compression' from source: unknown 41016 1727204189.11625: variable 'ansible_shell_type' from source: unknown 41016 1727204189.11627: variable 'ansible_shell_executable' from source: unknown 41016 1727204189.11628: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204189.11630: variable 'ansible_pipelining' from source: unknown 41016 1727204189.11632: variable 'ansible_timeout' from source: unknown 41016 1727204189.11634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204189.11751: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204189.11764: variable 'omit' from source: magic vars 41016 1727204189.11930: starting attempt loop 41016 1727204189.11934: running the handler 41016 1727204189.11936: _low_level_execute_command(): starting 41016 1727204189.11938: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204189.12841: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204189.12844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204189.12847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204189.13269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204189.13273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204189.13377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204189.15188: stdout chunk (state=3): >>>/root <<< 41016 1727204189.15344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204189.15348: stdout chunk (state=3): >>><<< 41016 1727204189.15350: stderr chunk (state=3): >>><<< 41016 1727204189.15467: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204189.15471: _low_level_execute_command(): starting 41016 1727204189.15474: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204189.1537256-42292-189691476972935 `" && echo ansible-tmp-1727204189.1537256-42292-189691476972935="` echo /root/.ansible/tmp/ansible-tmp-1727204189.1537256-42292-189691476972935 `" ) && sleep 0' 41016 1727204189.16049: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204189.16064: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204189.16088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204189.16131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204189.16142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204189.16240: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204189.16262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204189.16372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204189.18463: stdout chunk (state=3): >>>ansible-tmp-1727204189.1537256-42292-189691476972935=/root/.ansible/tmp/ansible-tmp-1727204189.1537256-42292-189691476972935 <<< 41016 1727204189.18632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204189.18657: stderr chunk (state=3): >>><<< 41016 1727204189.18665: stdout chunk (state=3): >>><<< 41016 1727204189.18881: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204189.1537256-42292-189691476972935=/root/.ansible/tmp/ansible-tmp-1727204189.1537256-42292-189691476972935 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204189.18885: variable 'ansible_module_compression' from source: unknown 41016 1727204189.18887: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204189.18890: variable 'ansible_facts' from source: unknown 41016 1727204189.18892: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204189.1537256-42292-189691476972935/AnsiballZ_command.py 41016 1727204189.19030: Sending initial data 41016 1727204189.19039: Sent initial data (156 bytes) 41016 1727204189.19638: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204189.19647: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204189.19749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204189.19774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204189.19879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204189.21611: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: 
Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204189.21707: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204189.21824: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpifdkhyoz /root/.ansible/tmp/ansible-tmp-1727204189.1537256-42292-189691476972935/AnsiballZ_command.py <<< 41016 1727204189.21827: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204189.1537256-42292-189691476972935/AnsiballZ_command.py" <<< 41016 1727204189.21892: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpifdkhyoz" to remote "/root/.ansible/tmp/ansible-tmp-1727204189.1537256-42292-189691476972935/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204189.1537256-42292-189691476972935/AnsiballZ_command.py" <<< 41016 1727204189.22737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204189.22931: stderr chunk (state=3): >>><<< 41016 1727204189.22934: stdout chunk (state=3): >>><<< 41016 1727204189.22935: done transferring module to remote 41016 1727204189.22937: _low_level_execute_command(): starting 41016 1727204189.22939: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204189.1537256-42292-189691476972935/ /root/.ansible/tmp/ansible-tmp-1727204189.1537256-42292-189691476972935/AnsiballZ_command.py && sleep 0' 41016 1727204189.23539: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204189.23553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204189.23590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204189.23602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204189.23625: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204189.23715: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204189.23746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204189.23854: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 41016 1727204189.25871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204189.25982: stdout chunk (state=3): >>><<< 41016 1727204189.25998: stderr chunk (state=3): >>><<< 41016 1727204189.26001: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204189.26007: _low_level_execute_command(): starting 41016 1727204189.26013: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204189.1537256-42292-189691476972935/AnsiballZ_command.py && sleep 0' 41016 1727204189.26648: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204189.26666: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204189.26730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204189.26753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204189.26784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204189.26910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204189.43596: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest1", "up"], "start": "2024-09-24 14:56:29.429891", "end": "2024-09-24 14:56:29.433957", "delta": "0:00:00.004066", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set 
ethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204189.45441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204189.45477: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. <<< 41016 1727204189.45481: stdout chunk (state=3): >>><<< 41016 1727204189.45483: stderr chunk (state=3): >>><<< 41016 1727204189.45618: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest1", "up"], "start": "2024-09-24 14:56:29.429891", "end": "2024-09-24 14:56:29.433957", "delta": "0:00:00.004066", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
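With `ip link set ethtest1 up` finished, all three loop items of the "Create veth interface ethtest1" task have run. Taken together, the commands the log records for this task are the ones below; the final `ip link del` is not in the log and is only noted as the usual way to tear the pair down again (deleting either end removes both):

    ip link add ethtest1 type veth peer name peerethtest1
    ip link set peerethtest1 up
    ip link set ethtest1 up
    # teardown (not run here): removes both ends of the pair
    # ip link del ethtest1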
41016 1727204189.45622: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest1 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204189.1537256-42292-189691476972935/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204189.45625: _low_level_execute_command(): starting 41016 1727204189.45627: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204189.1537256-42292-189691476972935/ > /dev/null 2>&1 && sleep 0' 41016 1727204189.46239: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204189.46254: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204189.46299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204189.46317: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204189.46411: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204189.46439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204189.46462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204189.46564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204189.48571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204189.48583: stdout chunk (state=3): >>><<< 41016 1727204189.48589: stderr chunk (state=3): >>><<< 41016 1727204189.48604: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204189.48613: handler run complete 41016 1727204189.48628: Evaluated conditional (False): False 41016 1727204189.48636: attempt loop complete, returning result 41016 1727204189.48650: variable 'item' from source: unknown 41016 1727204189.48712: variable 'item' from source: unknown ok: [managed-node1] => (item=ip link set ethtest1 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest1", "up" ], "delta": "0:00:00.004066", "end": "2024-09-24 14:56:29.433957", "item": "ip link set ethtest1 up", "rc": 0, "start": "2024-09-24 14:56:29.429891" } 41016 1727204189.48828: dumping result to json 41016 1727204189.48831: done dumping result, returning 41016 1727204189.48833: done running TaskExecutor() for managed-node1/TASK: Create veth interface ethtest1 [028d2410-947f-12d5-0ec4-000000000300] 41016 1727204189.48836: sending task result for task 028d2410-947f-12d5-0ec4-000000000300 41016 1727204189.48947: done sending task result for task 028d2410-947f-12d5-0ec4-000000000300 41016 1727204189.48949: WORKER PROCESS EXITING 41016 1727204189.49013: no more pending results, returning what we have 41016 1727204189.49017: results queue empty 41016 1727204189.49018: checking for any_errors_fatal 41016 1727204189.49021: done checking for any_errors_fatal 41016 1727204189.49022: checking for max_fail_percentage 41016 1727204189.49025: done checking for max_fail_percentage 41016 1727204189.49028: checking to see if all hosts have failed and the running result is not ok 41016 1727204189.49028: done checking to see if all hosts have failed 41016 1727204189.49029: getting the remaining hosts for this loop 41016 1727204189.49030: done getting the remaining hosts for this loop 41016 1727204189.49033: getting the next task for host managed-node1 41016 1727204189.49039: done getting next task for host managed-node1 41016 1727204189.49041: ^ task is: TASK: Set up veth as managed by NetworkManager 41016 1727204189.49043: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204189.49046: getting variables 41016 1727204189.49047: in VariableManager get_vars() 41016 1727204189.49078: Calling all_inventory to load vars for managed-node1 41016 1727204189.49081: Calling groups_inventory to load vars for managed-node1 41016 1727204189.49083: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204189.49097: Calling all_plugins_play to load vars for managed-node1 41016 1727204189.49100: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204189.49107: Calling groups_plugins_play to load vars for managed-node1 41016 1727204189.49278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204189.49503: done with get_vars() 41016 1727204189.49516: done getting variables 41016 1727204189.49577: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 14:56:29 -0400 (0:00:01.200) 0:00:13.172 ***** 41016 1727204189.49608: entering _queue_task() for managed-node1/command 41016 1727204189.49974: worker is 1 (out of 1 available) 41016 1727204189.49988: exiting _queue_task() for managed-node1/command 41016 1727204189.49999: done queuing things up, now waiting for results queue to drain 41016 1727204189.50005: waiting for pending results... 
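The variable sources logged for these tasks (state from "include params", type and interface from set_fact) point at the usual pattern of a test play including manage_test_interface.yml with parameters. A sketch of that call site, assumed rather than taken from this log:

    - name: Manage the test veth pair     # task name is hypothetical
      include_tasks: tasks/manage_test_interface.yml
      vars:
        state: present    # 'interface' and 'type' are set via set_fact elsewhere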
41016 1727204189.50169: running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager 41016 1727204189.50239: in run() - task 028d2410-947f-12d5-0ec4-000000000301 41016 1727204189.50250: variable 'ansible_search_path' from source: unknown 41016 1727204189.50254: variable 'ansible_search_path' from source: unknown 41016 1727204189.50284: calling self._execute() 41016 1727204189.50355: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204189.50359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204189.50365: variable 'omit' from source: magic vars 41016 1727204189.50621: variable 'ansible_distribution_major_version' from source: facts 41016 1727204189.50630: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204189.50734: variable 'type' from source: set_fact 41016 1727204189.50737: variable 'state' from source: include params 41016 1727204189.50740: Evaluated conditional (type == 'veth' and state == 'present'): True 41016 1727204189.50747: variable 'omit' from source: magic vars 41016 1727204189.50773: variable 'omit' from source: magic vars 41016 1727204189.50842: variable 'interface' from source: set_fact 41016 1727204189.50855: variable 'omit' from source: magic vars 41016 1727204189.50889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204189.50916: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204189.50931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204189.50944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204189.50954: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204189.50977: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204189.50980: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204189.50984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204189.51052: Set connection var ansible_shell_executable to /bin/sh 41016 1727204189.51055: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204189.51061: Set connection var ansible_shell_type to sh 41016 1727204189.51066: Set connection var ansible_timeout to 10 41016 1727204189.51071: Set connection var ansible_pipelining to False 41016 1727204189.51079: Set connection var ansible_connection to ssh 41016 1727204189.51095: variable 'ansible_shell_executable' from source: unknown 41016 1727204189.51098: variable 'ansible_connection' from source: unknown 41016 1727204189.51101: variable 'ansible_module_compression' from source: unknown 41016 1727204189.51104: variable 'ansible_shell_type' from source: unknown 41016 1727204189.51106: variable 'ansible_shell_executable' from source: unknown 41016 1727204189.51112: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204189.51115: variable 'ansible_pipelining' from source: unknown 41016 1727204189.51117: variable 'ansible_timeout' from source: unknown 41016 1727204189.51119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204189.51215: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204189.51223: variable 'omit' from source: magic vars 41016 1727204189.51235: starting attempt loop 41016 1727204189.51242: running the handler 41016 1727204189.51244: _low_level_execute_command(): starting 41016 1727204189.51253: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204189.51744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204189.51748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204189.51751: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204189.51753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204189.51806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204189.51809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204189.51814: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204189.51901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204189.53669: stdout chunk (state=3): >>>/root <<< 41016 1727204189.53770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204189.53804: stderr chunk (state=3): >>><<< 41016 1727204189.53806: stdout chunk (state=3): >>><<< 41016 1727204189.53820: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204189.53847: _low_level_execute_command(): starting 41016 1727204189.53850: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204189.5382488-42444-165570839474688 `" && echo ansible-tmp-1727204189.5382488-42444-165570839474688="` echo /root/.ansible/tmp/ansible-tmp-1727204189.5382488-42444-165570839474688 `" ) && sleep 0' 41016 1727204189.54247: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204189.54258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204189.54271: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204189.54314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204189.54327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204189.54432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204189.56499: stdout chunk (state=3): >>>ansible-tmp-1727204189.5382488-42444-165570839474688=/root/.ansible/tmp/ansible-tmp-1727204189.5382488-42444-165570839474688 <<< 41016 1727204189.56610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204189.56637: stderr chunk (state=3): >>><<< 41016 1727204189.56640: stdout chunk (state=3): >>><<< 41016 1727204189.56654: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204189.5382488-42444-165570839474688=/root/.ansible/tmp/ansible-tmp-1727204189.5382488-42444-165570839474688 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204189.56680: variable 'ansible_module_compression' from source: unknown 41016 1727204189.56719: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204189.56752: variable 'ansible_facts' from source: unknown 41016 1727204189.56806: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204189.5382488-42444-165570839474688/AnsiballZ_command.py 41016 1727204189.56902: Sending initial data 41016 1727204189.56906: Sent initial data (156 bytes) 41016 1727204189.57451: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204189.57490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204189.57610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204189.59335: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204189.59410: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204189.59482: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpuvr9cz1l /root/.ansible/tmp/ansible-tmp-1727204189.5382488-42444-165570839474688/AnsiballZ_command.py <<< 41016 1727204189.59489: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204189.5382488-42444-165570839474688/AnsiballZ_command.py" <<< 41016 1727204189.59553: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpuvr9cz1l" to remote "/root/.ansible/tmp/ansible-tmp-1727204189.5382488-42444-165570839474688/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204189.5382488-42444-165570839474688/AnsiballZ_command.py" <<< 41016 1727204189.60207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204189.60246: stderr chunk (state=3): >>><<< 41016 1727204189.60249: stdout chunk (state=3): >>><<< 41016 1727204189.60297: done transferring module to remote 41016 1727204189.60307: _low_level_execute_command(): starting 41016 1727204189.60314: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204189.5382488-42444-165570839474688/ /root/.ansible/tmp/ansible-tmp-1727204189.5382488-42444-165570839474688/AnsiballZ_command.py && sleep 0' 41016 1727204189.60730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204189.60734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204189.60736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41016 1727204189.60739: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204189.60781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204189.60785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204189.60865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204189.62883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204189.62886: stdout chunk (state=3): >>><<< 41016 1727204189.62894: stderr chunk (state=3): >>><<< 41016 1727204189.63011: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204189.63015: _low_level_execute_command(): starting 41016 1727204189.63018: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204189.5382488-42444-165570839474688/AnsiballZ_command.py && sleep 0' 41016 1727204189.63694: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204189.63748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204189.63764: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204189.63788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204189.63929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204189.82617: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest1", "managed", "true"], "start": "2024-09-24 14:56:29.802713", "end": "2024-09-24 14:56:29.823398", "delta": "0:00:00.020685", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest1 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204189.84397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204189.84401: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204189.84403: stderr chunk (state=3): >>><<< 41016 1727204189.84405: stdout chunk (state=3): >>><<< 41016 1727204189.84417: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest1", "managed", "true"], "start": "2024-09-24 14:56:29.802713", "end": "2024-09-24 14:56:29.823398", "delta": "0:00:00.020685", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest1 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
41016 1727204189.84461: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest1 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204189.5382488-42444-165570839474688/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204189.84468: _low_level_execute_command(): starting 41016 1727204189.84474: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204189.5382488-42444-165570839474688/ > /dev/null 2>&1 && sleep 0' 41016 1727204189.85282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204189.85286: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204189.85288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204189.85291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204189.85293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204189.85295: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204189.85297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204189.85298: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204189.85305: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204189.85316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204189.85336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204189.85355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204189.85558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204189.87505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204189.87519: stdout chunk (state=3): >>><<< 41016 1727204189.87550: stderr chunk (state=3): >>><<< 41016 1727204189.87581: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204189.87593: handler run complete 41016 1727204189.87623: Evaluated conditional (False): False 41016 1727204189.87657: attempt loop complete, returning result 41016 1727204189.87670: _execute() done 41016 1727204189.87680: dumping result to json 41016 1727204189.87755: done dumping result, returning 41016 1727204189.87758: done running TaskExecutor() for managed-node1/TASK: Set up veth as managed by NetworkManager [028d2410-947f-12d5-0ec4-000000000301] 41016 1727204189.87761: sending task result for task 028d2410-947f-12d5-0ec4-000000000301 41016 1727204189.87841: done sending task result for task 028d2410-947f-12d5-0ec4-000000000301 ok: [managed-node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest1", "managed", "true" ], "delta": "0:00:00.020685", "end": "2024-09-24 14:56:29.823398", "rc": 0, "start": "2024-09-24 14:56:29.802713" } 41016 1727204189.87915: no more pending results, returning what we have 41016 1727204189.87920: results queue empty 41016 1727204189.87921: checking for any_errors_fatal 41016 1727204189.87941: done checking for any_errors_fatal 41016 1727204189.87942: checking for max_fail_percentage 41016 1727204189.87944: done checking for max_fail_percentage 41016 1727204189.87945: checking to see if all hosts have failed and the running result is not ok 41016 1727204189.87946: done checking to see if all hosts have failed 41016 1727204189.87947: getting the remaining hosts for this loop 41016 1727204189.87948: done getting the remaining hosts for this loop 41016 1727204189.87954: getting the next task for host managed-node1 41016 1727204189.87962: done getting next task for host managed-node1 41016 1727204189.87965: ^ task is: TASK: Delete veth interface {{ interface }} 41016 1727204189.87968: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204189.87971: getting variables 41016 1727204189.87973: in VariableManager get_vars() 41016 1727204189.88021: Calling all_inventory to load vars for managed-node1 41016 1727204189.88024: Calling groups_inventory to load vars for managed-node1 41016 1727204189.88026: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204189.88035: Calling all_plugins_play to load vars for managed-node1 41016 1727204189.88038: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204189.88040: Calling groups_plugins_play to load vars for managed-node1 41016 1727204189.88606: WORKER PROCESS EXITING 41016 1727204189.88616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204189.88840: done with get_vars() 41016 1727204189.88851: done getting variables 41016 1727204189.88916: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204189.89048: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest1] ****************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.394) 0:00:13.566 ***** 41016 1727204189.89080: entering _queue_task() for managed-node1/command 41016 1727204189.89469: worker is 1 (out of 1 available) 41016 1727204189.89483: exiting _queue_task() for managed-node1/command 41016 1727204189.89497: done queuing things up, now waiting for results queue to drain 41016 1727204189.89498: waiting for pending results... 
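The delete branch that follows is gated on the interface actually being present. The conditional is taken from the log; the ip command is an assumption (deleting either end of a veth pair removes both):

    - name: Delete veth interface {{ interface }}
      command: ip link del {{ interface }}    # command assumed, not shown in this excerpt
      when: type == 'veth' and state == 'absent' and interface in current_interfaces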
41016 1727204189.89703: running TaskExecutor() for managed-node1/TASK: Delete veth interface ethtest1 41016 1727204189.89835: in run() - task 028d2410-947f-12d5-0ec4-000000000302 41016 1727204189.89912: variable 'ansible_search_path' from source: unknown 41016 1727204189.89915: variable 'ansible_search_path' from source: unknown 41016 1727204189.89918: calling self._execute() 41016 1727204189.90112: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204189.90131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204189.90146: variable 'omit' from source: magic vars 41016 1727204189.90537: variable 'ansible_distribution_major_version' from source: facts 41016 1727204189.90553: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204189.90768: variable 'type' from source: set_fact 41016 1727204189.90788: variable 'state' from source: include params 41016 1727204189.90797: variable 'interface' from source: set_fact 41016 1727204189.90893: variable 'current_interfaces' from source: set_fact 41016 1727204189.90897: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 41016 1727204189.90900: when evaluation is False, skipping this task 41016 1727204189.90902: _execute() done 41016 1727204189.90906: dumping result to json 41016 1727204189.90911: done dumping result, returning 41016 1727204189.90914: done running TaskExecutor() for managed-node1/TASK: Delete veth interface ethtest1 [028d2410-947f-12d5-0ec4-000000000302] 41016 1727204189.90916: sending task result for task 028d2410-947f-12d5-0ec4-000000000302 41016 1727204189.90984: done sending task result for task 028d2410-947f-12d5-0ec4-000000000302 41016 1727204189.90988: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 41016 1727204189.91046: no more pending results, returning what we have 41016 1727204189.91051: results queue empty 41016 1727204189.91052: checking for any_errors_fatal 41016 1727204189.91062: done checking for any_errors_fatal 41016 1727204189.91063: checking for max_fail_percentage 41016 1727204189.91064: done checking for max_fail_percentage 41016 1727204189.91065: checking to see if all hosts have failed and the running result is not ok 41016 1727204189.91066: done checking to see if all hosts have failed 41016 1727204189.91067: getting the remaining hosts for this loop 41016 1727204189.91068: done getting the remaining hosts for this loop 41016 1727204189.91072: getting the next task for host managed-node1 41016 1727204189.91081: done getting next task for host managed-node1 41016 1727204189.91085: ^ task is: TASK: Create dummy interface {{ interface }} 41016 1727204189.91088: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204189.91092: getting variables 41016 1727204189.91095: in VariableManager get_vars() 41016 1727204189.91147: Calling all_inventory to load vars for managed-node1 41016 1727204189.91150: Calling groups_inventory to load vars for managed-node1 41016 1727204189.91153: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204189.91166: Calling all_plugins_play to load vars for managed-node1 41016 1727204189.91169: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204189.91172: Calling groups_plugins_play to load vars for managed-node1 41016 1727204189.91644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204189.91851: done with get_vars() 41016 1727204189.91861: done getting variables 41016 1727204189.91927: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204189.92040: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest1] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.029) 0:00:13.596 ***** 41016 1727204189.92066: entering _queue_task() for managed-node1/command 41016 1727204189.92330: worker is 1 (out of 1 available) 41016 1727204189.92341: exiting _queue_task() for managed-node1/command 41016 1727204189.92464: done queuing things up, now waiting for results queue to drain 41016 1727204189.92466: waiting for pending results... 
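The dummy-interface branch evaluated next follows the same pattern; the conditional is from the log, the command is assumed:

    - name: Create dummy interface {{ interface }}
      command: ip link add {{ interface }} type dummy    # command assumed
      when: type == 'dummy' and state == 'present' and interface not in current_interfaces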
41016 1727204189.92696: running TaskExecutor() for managed-node1/TASK: Create dummy interface ethtest1 41016 1727204189.92793: in run() - task 028d2410-947f-12d5-0ec4-000000000303 41016 1727204189.92797: variable 'ansible_search_path' from source: unknown 41016 1727204189.92801: variable 'ansible_search_path' from source: unknown 41016 1727204189.92842: calling self._execute() 41016 1727204189.92982: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204189.92986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204189.92989: variable 'omit' from source: magic vars 41016 1727204189.93379: variable 'ansible_distribution_major_version' from source: facts 41016 1727204189.93397: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204189.93619: variable 'type' from source: set_fact 41016 1727204189.93630: variable 'state' from source: include params 41016 1727204189.93660: variable 'interface' from source: set_fact 41016 1727204189.93663: variable 'current_interfaces' from source: set_fact 41016 1727204189.93666: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 41016 1727204189.93669: when evaluation is False, skipping this task 41016 1727204189.93766: _execute() done 41016 1727204189.93771: dumping result to json 41016 1727204189.93774: done dumping result, returning 41016 1727204189.93777: done running TaskExecutor() for managed-node1/TASK: Create dummy interface ethtest1 [028d2410-947f-12d5-0ec4-000000000303] 41016 1727204189.93780: sending task result for task 028d2410-947f-12d5-0ec4-000000000303 41016 1727204189.93846: done sending task result for task 028d2410-947f-12d5-0ec4-000000000303 41016 1727204189.93850: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 41016 1727204189.93911: no more pending results, returning what we have 41016 1727204189.93916: results queue empty 41016 1727204189.93917: checking for any_errors_fatal 41016 1727204189.93925: done checking for any_errors_fatal 41016 1727204189.93926: checking for max_fail_percentage 41016 1727204189.93928: done checking for max_fail_percentage 41016 1727204189.93929: checking to see if all hosts have failed and the running result is not ok 41016 1727204189.93930: done checking to see if all hosts have failed 41016 1727204189.93930: getting the remaining hosts for this loop 41016 1727204189.93932: done getting the remaining hosts for this loop 41016 1727204189.93936: getting the next task for host managed-node1 41016 1727204189.93944: done getting next task for host managed-node1 41016 1727204189.93947: ^ task is: TASK: Delete dummy interface {{ interface }} 41016 1727204189.93951: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204189.93955: getting variables 41016 1727204189.93956: in VariableManager get_vars() 41016 1727204189.94116: Calling all_inventory to load vars for managed-node1 41016 1727204189.94120: Calling groups_inventory to load vars for managed-node1 41016 1727204189.94123: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204189.94134: Calling all_plugins_play to load vars for managed-node1 41016 1727204189.94138: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204189.94141: Calling groups_plugins_play to load vars for managed-node1 41016 1727204189.94433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204189.94670: done with get_vars() 41016 1727204189.94681: done getting variables 41016 1727204189.94732: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204189.94843: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest1] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.028) 0:00:13.624 ***** 41016 1727204189.94878: entering _queue_task() for managed-node1/command 41016 1727204189.95133: worker is 1 (out of 1 available) 41016 1727204189.95146: exiting _queue_task() for managed-node1/command 41016 1727204189.95158: done queuing things up, now waiting for results queue to drain 41016 1727204189.95159: waiting for pending results... 
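Its absent counterpart is gated the same way (command assumed):

    - name: Delete dummy interface {{ interface }}
      command: ip link del {{ interface }} type dummy    # command assumed
      when: type == 'dummy' and state == 'absent' and interface in current_interfaces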
41016 1727204189.95519: running TaskExecutor() for managed-node1/TASK: Delete dummy interface ethtest1 41016 1727204189.95539: in run() - task 028d2410-947f-12d5-0ec4-000000000304 41016 1727204189.95581: variable 'ansible_search_path' from source: unknown 41016 1727204189.95585: variable 'ansible_search_path' from source: unknown 41016 1727204189.95607: calling self._execute() 41016 1727204189.95704: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204189.95725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204189.95781: variable 'omit' from source: magic vars 41016 1727204189.96140: variable 'ansible_distribution_major_version' from source: facts 41016 1727204189.96162: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204189.96383: variable 'type' from source: set_fact 41016 1727204189.96481: variable 'state' from source: include params 41016 1727204189.96489: variable 'interface' from source: set_fact 41016 1727204189.96492: variable 'current_interfaces' from source: set_fact 41016 1727204189.96496: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 41016 1727204189.96499: when evaluation is False, skipping this task 41016 1727204189.96501: _execute() done 41016 1727204189.96503: dumping result to json 41016 1727204189.96505: done dumping result, returning 41016 1727204189.96508: done running TaskExecutor() for managed-node1/TASK: Delete dummy interface ethtest1 [028d2410-947f-12d5-0ec4-000000000304] 41016 1727204189.96512: sending task result for task 028d2410-947f-12d5-0ec4-000000000304 41016 1727204189.96577: done sending task result for task 028d2410-947f-12d5-0ec4-000000000304 41016 1727204189.96581: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 41016 1727204189.96640: no more pending results, returning what we have 41016 1727204189.96645: results queue empty 41016 1727204189.96646: checking for any_errors_fatal 41016 1727204189.96652: done checking for any_errors_fatal 41016 1727204189.96652: checking for max_fail_percentage 41016 1727204189.96654: done checking for max_fail_percentage 41016 1727204189.96655: checking to see if all hosts have failed and the running result is not ok 41016 1727204189.96656: done checking to see if all hosts have failed 41016 1727204189.96657: getting the remaining hosts for this loop 41016 1727204189.96659: done getting the remaining hosts for this loop 41016 1727204189.96662: getting the next task for host managed-node1 41016 1727204189.96669: done getting next task for host managed-node1 41016 1727204189.96672: ^ task is: TASK: Create tap interface {{ interface }} 41016 1727204189.96675: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204189.96681: getting variables 41016 1727204189.96682: in VariableManager get_vars() 41016 1727204189.96806: Calling all_inventory to load vars for managed-node1 41016 1727204189.96925: Calling groups_inventory to load vars for managed-node1 41016 1727204189.96929: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204189.96939: Calling all_plugins_play to load vars for managed-node1 41016 1727204189.96942: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204189.96945: Calling groups_plugins_play to load vars for managed-node1 41016 1727204189.97172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204189.97382: done with get_vars() 41016 1727204189.97391: done getting variables 41016 1727204189.97453: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204189.97571: variable 'interface' from source: set_fact TASK [Create tap interface ethtest1] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.027) 0:00:13.652 ***** 41016 1727204189.97605: entering _queue_task() for managed-node1/command 41016 1727204189.97858: worker is 1 (out of 1 available) 41016 1727204189.97870: exiting _queue_task() for managed-node1/command 41016 1727204189.98084: done queuing things up, now waiting for results queue to drain 41016 1727204189.98086: waiting for pending results... 
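The tap branch would use ip tuntap rather than ip link; the conditional is from the log, the command is assumed:

    - name: Create tap interface {{ interface }}
      command: ip tuntap add dev {{ interface }} mode tap    # command assumed
      when: type == 'tap' and state == 'present' and interface not in current_interfaces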
41016 1727204189.98291: running TaskExecutor() for managed-node1/TASK: Create tap interface ethtest1 41016 1727204189.98295: in run() - task 028d2410-947f-12d5-0ec4-000000000305 41016 1727204189.98298: variable 'ansible_search_path' from source: unknown 41016 1727204189.98300: variable 'ansible_search_path' from source: unknown 41016 1727204189.98324: calling self._execute() 41016 1727204189.98417: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204189.98432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204189.98447: variable 'omit' from source: magic vars 41016 1727204189.98798: variable 'ansible_distribution_major_version' from source: facts 41016 1727204189.98807: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204189.98946: variable 'type' from source: set_fact 41016 1727204189.98949: variable 'state' from source: include params 41016 1727204189.98952: variable 'interface' from source: set_fact 41016 1727204189.98964: variable 'current_interfaces' from source: set_fact 41016 1727204189.98967: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 41016 1727204189.98970: when evaluation is False, skipping this task 41016 1727204189.98972: _execute() done 41016 1727204189.98974: dumping result to json 41016 1727204189.98979: done dumping result, returning 41016 1727204189.98985: done running TaskExecutor() for managed-node1/TASK: Create tap interface ethtest1 [028d2410-947f-12d5-0ec4-000000000305] 41016 1727204189.98990: sending task result for task 028d2410-947f-12d5-0ec4-000000000305 41016 1727204189.99063: done sending task result for task 028d2410-947f-12d5-0ec4-000000000305 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 41016 1727204189.99117: no more pending results, returning what we have 41016 1727204189.99121: results queue empty 41016 1727204189.99122: checking for any_errors_fatal 41016 1727204189.99127: done checking for any_errors_fatal 41016 1727204189.99128: checking for max_fail_percentage 41016 1727204189.99130: done checking for max_fail_percentage 41016 1727204189.99131: checking to see if all hosts have failed and the running result is not ok 41016 1727204189.99132: done checking to see if all hosts have failed 41016 1727204189.99132: getting the remaining hosts for this loop 41016 1727204189.99134: done getting the remaining hosts for this loop 41016 1727204189.99137: getting the next task for host managed-node1 41016 1727204189.99142: done getting next task for host managed-node1 41016 1727204189.99146: ^ task is: TASK: Delete tap interface {{ interface }} 41016 1727204189.99148: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204189.99151: getting variables 41016 1727204189.99152: in VariableManager get_vars() 41016 1727204189.99193: Calling all_inventory to load vars for managed-node1 41016 1727204189.99196: Calling groups_inventory to load vars for managed-node1 41016 1727204189.99198: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204189.99203: WORKER PROCESS EXITING 41016 1727204189.99211: Calling all_plugins_play to load vars for managed-node1 41016 1727204189.99213: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204189.99216: Calling groups_plugins_play to load vars for managed-node1 41016 1727204189.99363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204189.99481: done with get_vars() 41016 1727204189.99487: done getting variables 41016 1727204189.99529: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204189.99601: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest1] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.020) 0:00:13.672 ***** 41016 1727204189.99623: entering _queue_task() for managed-node1/command 41016 1727204189.99799: worker is 1 (out of 1 available) 41016 1727204189.99812: exiting _queue_task() for managed-node1/command 41016 1727204189.99823: done queuing things up, now waiting for results queue to drain 41016 1727204189.99824: waiting for pending results... 
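The last of the skipped branches removes a tap device (command assumed); after this the play moves on to the "Assert device is present" include:

    - name: Delete tap interface {{ interface }}
      command: ip tuntap del dev {{ interface }} mode tap    # command assumed
      when: type == 'tap' and state == 'absent' and interface in current_interfaces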
41016 1727204189.99979: running TaskExecutor() for managed-node1/TASK: Delete tap interface ethtest1 41016 1727204190.00044: in run() - task 028d2410-947f-12d5-0ec4-000000000306 41016 1727204190.00054: variable 'ansible_search_path' from source: unknown 41016 1727204190.00062: variable 'ansible_search_path' from source: unknown 41016 1727204190.00092: calling self._execute() 41016 1727204190.00155: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204190.00161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204190.00171: variable 'omit' from source: magic vars 41016 1727204190.00581: variable 'ansible_distribution_major_version' from source: facts 41016 1727204190.00584: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204190.00692: variable 'type' from source: set_fact 41016 1727204190.00703: variable 'state' from source: include params 41016 1727204190.00715: variable 'interface' from source: set_fact 41016 1727204190.00725: variable 'current_interfaces' from source: set_fact 41016 1727204190.00739: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 41016 1727204190.00747: when evaluation is False, skipping this task 41016 1727204190.00755: _execute() done 41016 1727204190.00762: dumping result to json 41016 1727204190.00770: done dumping result, returning 41016 1727204190.00783: done running TaskExecutor() for managed-node1/TASK: Delete tap interface ethtest1 [028d2410-947f-12d5-0ec4-000000000306] 41016 1727204190.00792: sending task result for task 028d2410-947f-12d5-0ec4-000000000306 skipping: [managed-node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 41016 1727204190.00938: no more pending results, returning what we have 41016 1727204190.00943: results queue empty 41016 1727204190.00944: checking for any_errors_fatal 41016 1727204190.00951: done checking for any_errors_fatal 41016 1727204190.00952: checking for max_fail_percentage 41016 1727204190.00953: done checking for max_fail_percentage 41016 1727204190.00954: checking to see if all hosts have failed and the running result is not ok 41016 1727204190.00955: done checking to see if all hosts have failed 41016 1727204190.00956: getting the remaining hosts for this loop 41016 1727204190.00957: done getting the remaining hosts for this loop 41016 1727204190.00961: getting the next task for host managed-node1 41016 1727204190.00970: done getting next task for host managed-node1 41016 1727204190.00974: ^ task is: TASK: Assert device is present 41016 1727204190.00979: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204190.00984: getting variables 41016 1727204190.00985: in VariableManager get_vars() 41016 1727204190.01031: Calling all_inventory to load vars for managed-node1 41016 1727204190.01034: Calling groups_inventory to load vars for managed-node1 41016 1727204190.01037: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204190.01050: Calling all_plugins_play to load vars for managed-node1 41016 1727204190.01053: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204190.01056: Calling groups_plugins_play to load vars for managed-node1 41016 1727204190.01512: done sending task result for task 028d2410-947f-12d5-0ec4-000000000306 41016 1727204190.01515: WORKER PROCESS EXITING 41016 1727204190.01554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204190.01682: done with get_vars() 41016 1727204190.01690: done getting variables TASK [Assert device is present] ************************************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:32 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.021) 0:00:13.693 ***** 41016 1727204190.01756: entering _queue_task() for managed-node1/include_tasks 41016 1727204190.01967: worker is 1 (out of 1 available) 41016 1727204190.01981: exiting _queue_task() for managed-node1/include_tasks 41016 1727204190.01993: done queuing things up, now waiting for results queue to drain 41016 1727204190.01994: waiting for pending results... 41016 1727204190.02159: running TaskExecutor() for managed-node1/TASK: Assert device is present 41016 1727204190.02223: in run() - task 028d2410-947f-12d5-0ec4-000000000012 41016 1727204190.02234: variable 'ansible_search_path' from source: unknown 41016 1727204190.02262: calling self._execute() 41016 1727204190.02326: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204190.02330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204190.02342: variable 'omit' from source: magic vars 41016 1727204190.02600: variable 'ansible_distribution_major_version' from source: facts 41016 1727204190.02611: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204190.02614: _execute() done 41016 1727204190.02617: dumping result to json 41016 1727204190.02619: done dumping result, returning 41016 1727204190.02627: done running TaskExecutor() for managed-node1/TASK: Assert device is present [028d2410-947f-12d5-0ec4-000000000012] 41016 1727204190.02631: sending task result for task 028d2410-947f-12d5-0ec4-000000000012 41016 1727204190.02718: done sending task result for task 028d2410-947f-12d5-0ec4-000000000012 41016 1727204190.02720: WORKER PROCESS EXITING 41016 1727204190.02745: no more pending results, returning what we have 41016 1727204190.02750: in VariableManager get_vars() 41016 1727204190.02794: Calling all_inventory to load vars for managed-node1 41016 1727204190.02797: Calling groups_inventory to load vars for managed-node1 41016 1727204190.02800: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204190.02812: Calling all_plugins_play to load vars for managed-node1 41016 1727204190.02815: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204190.02817: Calling groups_plugins_play to load vars for managed-node1 41016 1727204190.02990: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204190.03102: done with get_vars() 41016 1727204190.03107: variable 'ansible_search_path' from source: unknown 41016 1727204190.03117: we have included files to process 41016 1727204190.03117: generating all_blocks data 41016 1727204190.03118: done generating all_blocks data 41016 1727204190.03121: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 41016 1727204190.03122: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 41016 1727204190.03123: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 41016 1727204190.03190: in VariableManager get_vars() 41016 1727204190.03207: done with get_vars() 41016 1727204190.03280: done processing included file 41016 1727204190.03282: iterating over new_blocks loaded from include file 41016 1727204190.03283: in VariableManager get_vars() 41016 1727204190.03293: done with get_vars() 41016 1727204190.03294: filtering new block on tags 41016 1727204190.03305: done filtering new block on tags 41016 1727204190.03306: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node1 41016 1727204190.03310: extending task lists for all hosts with included blocks 41016 1727204190.04542: done extending task lists 41016 1727204190.04543: done processing included files 41016 1727204190.04544: results queue empty 41016 1727204190.04545: checking for any_errors_fatal 41016 1727204190.04548: done checking for any_errors_fatal 41016 1727204190.04548: checking for max_fail_percentage 41016 1727204190.04549: done checking for max_fail_percentage 41016 1727204190.04550: checking to see if all hosts have failed and the running result is not ok 41016 1727204190.04551: done checking to see if all hosts have failed 41016 1727204190.04552: getting the remaining hosts for this loop 41016 1727204190.04553: done getting the remaining hosts for this loop 41016 1727204190.04555: getting the next task for host managed-node1 41016 1727204190.04559: done getting next task for host managed-node1 41016 1727204190.04562: ^ task is: TASK: Include the task 'get_interface_stat.yml' 41016 1727204190.04564: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204190.04567: getting variables 41016 1727204190.04568: in VariableManager get_vars() 41016 1727204190.04585: Calling all_inventory to load vars for managed-node1 41016 1727204190.04587: Calling groups_inventory to load vars for managed-node1 41016 1727204190.04589: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204190.04595: Calling all_plugins_play to load vars for managed-node1 41016 1727204190.04598: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204190.04600: Calling groups_plugins_play to load vars for managed-node1 41016 1727204190.04779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204190.04973: done with get_vars() 41016 1727204190.04985: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.033) 0:00:13.726 ***** 41016 1727204190.05062: entering _queue_task() for managed-node1/include_tasks 41016 1727204190.05375: worker is 1 (out of 1 available) 41016 1727204190.05390: exiting _queue_task() for managed-node1/include_tasks 41016 1727204190.05405: done queuing things up, now waiting for results queue to drain 41016 1727204190.05406: waiting for pending results... 41016 1727204190.05794: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 41016 1727204190.05838: in run() - task 028d2410-947f-12d5-0ec4-0000000003eb 41016 1727204190.05857: variable 'ansible_search_path' from source: unknown 41016 1727204190.05935: variable 'ansible_search_path' from source: unknown 41016 1727204190.05939: calling self._execute() 41016 1727204190.05997: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204190.06011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204190.06026: variable 'omit' from source: magic vars 41016 1727204190.06413: variable 'ansible_distribution_major_version' from source: facts 41016 1727204190.06430: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204190.06441: _execute() done 41016 1727204190.06449: dumping result to json 41016 1727204190.06457: done dumping result, returning 41016 1727204190.06466: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-12d5-0ec4-0000000003eb] 41016 1727204190.06483: sending task result for task 028d2410-947f-12d5-0ec4-0000000003eb 41016 1727204190.06612: no more pending results, returning what we have 41016 1727204190.06619: in VariableManager get_vars() 41016 1727204190.06674: Calling all_inventory to load vars for managed-node1 41016 1727204190.06679: Calling groups_inventory to load vars for managed-node1 41016 1727204190.06682: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204190.06697: Calling all_plugins_play to load vars for managed-node1 41016 1727204190.06700: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204190.06704: Calling groups_plugins_play to load vars for managed-node1 41016 1727204190.07171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204190.07572: done with get_vars() 41016 1727204190.07583: variable 'ansible_search_path' from source: unknown 
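The delete counterpart is skipped for the same reason (its conditional is shown above), and the play then reaches "Assert device is present", which the log dispatches as an include_tasks of assert_device_present.yml. A hedged sketch of both pieces follows; the ip command and the relative include path are assumptions based on the task names and file paths echoed in the log.

# Hypothetical sketch of the skipped cleanup task in manage_test_interface.yml:
- name: Delete tap interface {{ interface }}
  ansible.builtin.command: ip tuntap del dev {{ interface }} mode tap
  when: type == 'tap' and state == 'absent' and interface in current_interfaces

# Plausible shape of the dispatch in tests_route_device.yml:32 that pulls in
# the device check (relative path assumed):
- name: Assert device is present
  ansible.builtin.include_tasks: tasks/assert_device_present.yml

The include itself produces no remote action; it only extends the task list for managed-node1, which is why the log shows "extending task lists for all hosts with included blocks" rather than an SSH round trip.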
41016 1727204190.07584: variable 'ansible_search_path' from source: unknown 41016 1727204190.07687: done sending task result for task 028d2410-947f-12d5-0ec4-0000000003eb 41016 1727204190.07690: WORKER PROCESS EXITING 41016 1727204190.07721: we have included files to process 41016 1727204190.07723: generating all_blocks data 41016 1727204190.07724: done generating all_blocks data 41016 1727204190.07725: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41016 1727204190.07726: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41016 1727204190.07729: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41016 1727204190.08014: done processing included file 41016 1727204190.08016: iterating over new_blocks loaded from include file 41016 1727204190.08018: in VariableManager get_vars() 41016 1727204190.08037: done with get_vars() 41016 1727204190.08038: filtering new block on tags 41016 1727204190.08053: done filtering new block on tags 41016 1727204190.08056: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 41016 1727204190.08296: extending task lists for all hosts with included blocks 41016 1727204190.08403: done extending task lists 41016 1727204190.08404: done processing included files 41016 1727204190.08405: results queue empty 41016 1727204190.08405: checking for any_errors_fatal 41016 1727204190.08411: done checking for any_errors_fatal 41016 1727204190.08412: checking for max_fail_percentage 41016 1727204190.08413: done checking for max_fail_percentage 41016 1727204190.08414: checking to see if all hosts have failed and the running result is not ok 41016 1727204190.08415: done checking to see if all hosts have failed 41016 1727204190.08415: getting the remaining hosts for this loop 41016 1727204190.08416: done getting the remaining hosts for this loop 41016 1727204190.08419: getting the next task for host managed-node1 41016 1727204190.08423: done getting next task for host managed-node1 41016 1727204190.08425: ^ task is: TASK: Get stat for interface {{ interface }} 41016 1727204190.08428: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204190.08431: getting variables 41016 1727204190.08432: in VariableManager get_vars() 41016 1727204190.08446: Calling all_inventory to load vars for managed-node1 41016 1727204190.08448: Calling groups_inventory to load vars for managed-node1 41016 1727204190.08450: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204190.08455: Calling all_plugins_play to load vars for managed-node1 41016 1727204190.08458: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204190.08460: Calling groups_plugins_play to load vars for managed-node1 41016 1727204190.08692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204190.08897: done with get_vars() 41016 1727204190.08907: done getting variables 41016 1727204190.09065: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest1] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.040) 0:00:13.767 ***** 41016 1727204190.09097: entering _queue_task() for managed-node1/stat 41016 1727204190.09415: worker is 1 (out of 1 available) 41016 1727204190.09428: exiting _queue_task() for managed-node1/stat 41016 1727204190.09440: done queuing things up, now waiting for results queue to drain 41016 1727204190.09442: waiting for pending results... 41016 1727204190.09658: running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest1 41016 1727204190.09783: in run() - task 028d2410-947f-12d5-0ec4-000000000483 41016 1727204190.09808: variable 'ansible_search_path' from source: unknown 41016 1727204190.09816: variable 'ansible_search_path' from source: unknown 41016 1727204190.09856: calling self._execute() 41016 1727204190.09951: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204190.09963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204190.09982: variable 'omit' from source: magic vars 41016 1727204190.10353: variable 'ansible_distribution_major_version' from source: facts 41016 1727204190.10371: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204190.10386: variable 'omit' from source: magic vars 41016 1727204190.10448: variable 'omit' from source: magic vars 41016 1727204190.10668: variable 'interface' from source: set_fact 41016 1727204190.10726: variable 'omit' from source: magic vars 41016 1727204190.10983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204190.10987: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204190.10990: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204190.11011: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204190.11028: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204190.11593: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204190.11596: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204190.11599: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 41016 1727204190.11601: Set connection var ansible_shell_executable to /bin/sh 41016 1727204190.11603: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204190.11605: Set connection var ansible_shell_type to sh 41016 1727204190.11606: Set connection var ansible_timeout to 10 41016 1727204190.11608: Set connection var ansible_pipelining to False 41016 1727204190.11610: Set connection var ansible_connection to ssh 41016 1727204190.11612: variable 'ansible_shell_executable' from source: unknown 41016 1727204190.11614: variable 'ansible_connection' from source: unknown 41016 1727204190.11616: variable 'ansible_module_compression' from source: unknown 41016 1727204190.11618: variable 'ansible_shell_type' from source: unknown 41016 1727204190.11621: variable 'ansible_shell_executable' from source: unknown 41016 1727204190.11623: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204190.11625: variable 'ansible_pipelining' from source: unknown 41016 1727204190.11627: variable 'ansible_timeout' from source: unknown 41016 1727204190.11629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204190.12360: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204190.12527: variable 'omit' from source: magic vars 41016 1727204190.12539: starting attempt loop 41016 1727204190.12546: running the handler 41016 1727204190.12564: _low_level_execute_command(): starting 41016 1727204190.12578: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204190.13993: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204190.14014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204190.14273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204190.14295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204190.16190: stdout chunk (state=3): >>>/root <<< 41016 1727204190.16288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204190.16326: stderr chunk (state=3): >>><<< 41016 1727204190.16336: stdout chunk (state=3): >>><<< 41016 1727204190.16537: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204190.16541: _low_level_execute_command(): starting 41016 1727204190.16545: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204190.164536-42481-104900418065542 `" && echo ansible-tmp-1727204190.164536-42481-104900418065542="` echo /root/.ansible/tmp/ansible-tmp-1727204190.164536-42481-104900418065542 `" ) && sleep 0' 41016 1727204190.17730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204190.17821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204190.17845: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204190.17970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204190.18087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204190.20217: stdout chunk (state=3): >>>ansible-tmp-1727204190.164536-42481-104900418065542=/root/.ansible/tmp/ansible-tmp-1727204190.164536-42481-104900418065542 <<< 41016 1727204190.20654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204190.20658: stdout chunk (state=3): >>><<< 41016 1727204190.20661: stderr chunk (state=3): >>><<< 41016 1727204190.20664: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204190.164536-42481-104900418065542=/root/.ansible/tmp/ansible-tmp-1727204190.164536-42481-104900418065542 , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204190.20667: variable 'ansible_module_compression' from source: unknown 41016 1727204190.20683: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 41016 1727204190.20726: variable 'ansible_facts' from source: unknown 41016 1727204190.21004: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204190.164536-42481-104900418065542/AnsiballZ_stat.py 41016 1727204190.21242: Sending initial data 41016 1727204190.21251: Sent initial data (152 bytes) 41016 1727204190.22548: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204190.22558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204190.22574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204190.22741: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204190.22894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204190.22967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204190.24731: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204190.24821: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204190.24962: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmp9wdgo445 /root/.ansible/tmp/ansible-tmp-1727204190.164536-42481-104900418065542/AnsiballZ_stat.py <<< 41016 1727204190.24972: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204190.164536-42481-104900418065542/AnsiballZ_stat.py" <<< 41016 1727204190.25049: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmp9wdgo445" to remote "/root/.ansible/tmp/ansible-tmp-1727204190.164536-42481-104900418065542/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204190.164536-42481-104900418065542/AnsiballZ_stat.py" <<< 41016 1727204190.25966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204190.26046: stdout chunk (state=3): >>><<< 41016 1727204190.26049: stderr chunk (state=3): >>><<< 41016 1727204190.26067: done transferring module to remote 41016 1727204190.26085: _low_level_execute_command(): starting 41016 1727204190.26095: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204190.164536-42481-104900418065542/ /root/.ansible/tmp/ansible-tmp-1727204190.164536-42481-104900418065542/AnsiballZ_stat.py && sleep 0' 41016 1727204190.26738: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204190.26752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204190.26788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204190.26893: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204190.26942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204190.27130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204190.28956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204190.29009: stderr chunk (state=3): >>><<< 41016 1727204190.29030: 
stdout chunk (state=3): >>><<< 41016 1727204190.29053: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204190.29061: _low_level_execute_command(): starting 41016 1727204190.29070: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204190.164536-42481-104900418065542/AnsiballZ_stat.py && sleep 0' 41016 1727204190.29641: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204190.29654: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204190.29667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204190.29686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204190.29705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204190.29722: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204190.29735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204190.29755: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204190.29836: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204190.29852: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204190.29868: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204190.29895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204190.30015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204190.46316: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest1", "mode": "0777", "isdir": false, "ischr": 
false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30614, "dev": 23, "nlink": 1, "atime": 1727204188.6388657, "mtime": 1727204188.6388657, "ctime": 1727204188.6388657, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 41016 1727204190.47897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204190.47945: stderr chunk (state=3): >>><<< 41016 1727204190.47952: stdout chunk (state=3): >>><<< 41016 1727204190.47973: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30614, "dev": 23, "nlink": 1, "atime": 1727204188.6388657, "mtime": 1727204188.6388657, "ctime": 1727204188.6388657, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
41016 1727204190.48039: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204190.164536-42481-104900418065542/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204190.48056: _low_level_execute_command(): starting 41016 1727204190.48065: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204190.164536-42481-104900418065542/ > /dev/null 2>&1 && sleep 0' 41016 1727204190.49069: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204190.49088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204190.49113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204190.49144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204190.49162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204190.49174: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204190.49258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204190.49311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204190.49327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204190.49446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204190.51413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204190.51445: stderr chunk (state=3): >>><<< 41016 1727204190.51454: stdout chunk (state=3): >>><<< 41016 1727204190.51481: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204190.51509: handler run complete 41016 1727204190.51551: attempt loop complete, returning result 41016 1727204190.51617: _execute() done 41016 1727204190.51620: dumping result to json 41016 1727204190.51623: done dumping result, returning 41016 1727204190.51625: done running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest1 [028d2410-947f-12d5-0ec4-000000000483] 41016 1727204190.51627: sending task result for task 028d2410-947f-12d5-0ec4-000000000483 ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727204188.6388657, "block_size": 4096, "blocks": 0, "ctime": 1727204188.6388657, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30614, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "mode": "0777", "mtime": 1727204188.6388657, "nlink": 1, "path": "/sys/class/net/ethtest1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 41016 1727204190.51797: no more pending results, returning what we have 41016 1727204190.51800: results queue empty 41016 1727204190.51801: checking for any_errors_fatal 41016 1727204190.51803: done checking for any_errors_fatal 41016 1727204190.51803: checking for max_fail_percentage 41016 1727204190.51805: done checking for max_fail_percentage 41016 1727204190.51806: checking to see if all hosts have failed and the running result is not ok 41016 1727204190.51806: done checking to see if all hosts have failed 41016 1727204190.51807: getting the remaining hosts for this loop 41016 1727204190.51810: done getting the remaining hosts for this loop 41016 1727204190.51814: getting the next task for host managed-node1 41016 1727204190.51822: done getting next task for host managed-node1 41016 1727204190.51824: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 41016 1727204190.51827: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204190.51830: getting variables 41016 1727204190.51831: in VariableManager get_vars() 41016 1727204190.51870: Calling all_inventory to load vars for managed-node1 41016 1727204190.51873: Calling groups_inventory to load vars for managed-node1 41016 1727204190.52046: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204190.52054: done sending task result for task 028d2410-947f-12d5-0ec4-000000000483 41016 1727204190.52056: WORKER PROCESS EXITING 41016 1727204190.52066: Calling all_plugins_play to load vars for managed-node1 41016 1727204190.52068: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204190.52071: Calling groups_plugins_play to load vars for managed-node1 41016 1727204190.52523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204190.52729: done with get_vars() 41016 1727204190.52740: done getting variables 41016 1727204190.52802: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204190.52919: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest1'] *********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.438) 0:00:14.205 ***** 41016 1727204190.52946: entering _queue_task() for managed-node1/assert 41016 1727204190.53403: worker is 1 (out of 1 available) 41016 1727204190.53412: exiting _queue_task() for managed-node1/assert 41016 1727204190.53421: done queuing things up, now waiting for results queue to drain 41016 1727204190.53422: waiting for pending results... 
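The queued assert consumes the stat result registered by the previous task. A minimal sketch of the task at assert_device_present.yml:5, assuming it checks nothing beyond the existence flag that the executor evaluates below ("Evaluated conditional (interface_stat.stat.exists): True"); any additional assertions or a custom fail message in the real file are not shown here.

# Minimal sketch of the assertion, assuming only the existence check:
- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists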
41016 1727204190.53520: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'ethtest1' 41016 1727204190.53650: in run() - task 028d2410-947f-12d5-0ec4-0000000003ec 41016 1727204190.53654: variable 'ansible_search_path' from source: unknown 41016 1727204190.53657: variable 'ansible_search_path' from source: unknown 41016 1727204190.53759: calling self._execute() 41016 1727204190.53790: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204190.53800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204190.53814: variable 'omit' from source: magic vars 41016 1727204190.54194: variable 'ansible_distribution_major_version' from source: facts 41016 1727204190.54212: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204190.54222: variable 'omit' from source: magic vars 41016 1727204190.54261: variable 'omit' from source: magic vars 41016 1727204190.54369: variable 'interface' from source: set_fact 41016 1727204190.54395: variable 'omit' from source: magic vars 41016 1727204190.54447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204190.54489: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204190.54628: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204190.54631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204190.54633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204190.54634: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204190.54636: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204190.54638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204190.54693: Set connection var ansible_shell_executable to /bin/sh 41016 1727204190.54704: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204190.54713: Set connection var ansible_shell_type to sh 41016 1727204190.54720: Set connection var ansible_timeout to 10 41016 1727204190.54727: Set connection var ansible_pipelining to False 41016 1727204190.54744: Set connection var ansible_connection to ssh 41016 1727204190.54766: variable 'ansible_shell_executable' from source: unknown 41016 1727204190.54772: variable 'ansible_connection' from source: unknown 41016 1727204190.54847: variable 'ansible_module_compression' from source: unknown 41016 1727204190.54851: variable 'ansible_shell_type' from source: unknown 41016 1727204190.54853: variable 'ansible_shell_executable' from source: unknown 41016 1727204190.54855: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204190.54858: variable 'ansible_pipelining' from source: unknown 41016 1727204190.54860: variable 'ansible_timeout' from source: unknown 41016 1727204190.54863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204190.54960: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 41016 1727204190.54981: variable 'omit' from source: magic vars 41016 1727204190.54994: starting attempt loop 41016 1727204190.55000: running the handler 41016 1727204190.55140: variable 'interface_stat' from source: set_fact 41016 1727204190.55170: Evaluated conditional (interface_stat.stat.exists): True 41016 1727204190.55188: handler run complete 41016 1727204190.55285: attempt loop complete, returning result 41016 1727204190.55289: _execute() done 41016 1727204190.55292: dumping result to json 41016 1727204190.55294: done dumping result, returning 41016 1727204190.55296: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'ethtest1' [028d2410-947f-12d5-0ec4-0000000003ec] 41016 1727204190.55299: sending task result for task 028d2410-947f-12d5-0ec4-0000000003ec 41016 1727204190.55361: done sending task result for task 028d2410-947f-12d5-0ec4-0000000003ec 41016 1727204190.55364: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 41016 1727204190.55438: no more pending results, returning what we have 41016 1727204190.55442: results queue empty 41016 1727204190.55444: checking for any_errors_fatal 41016 1727204190.55454: done checking for any_errors_fatal 41016 1727204190.55455: checking for max_fail_percentage 41016 1727204190.55457: done checking for max_fail_percentage 41016 1727204190.55458: checking to see if all hosts have failed and the running result is not ok 41016 1727204190.55459: done checking to see if all hosts have failed 41016 1727204190.55460: getting the remaining hosts for this loop 41016 1727204190.55461: done getting the remaining hosts for this loop 41016 1727204190.55465: getting the next task for host managed-node1 41016 1727204190.55480: done getting next task for host managed-node1 41016 1727204190.55487: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41016 1727204190.55490: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204190.55517: getting variables 41016 1727204190.55519: in VariableManager get_vars() 41016 1727204190.55566: Calling all_inventory to load vars for managed-node1 41016 1727204190.55570: Calling groups_inventory to load vars for managed-node1 41016 1727204190.55573: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204190.55788: Calling all_plugins_play to load vars for managed-node1 41016 1727204190.55793: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204190.55797: Calling groups_plugins_play to load vars for managed-node1 41016 1727204190.55974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204190.56250: done with get_vars() 41016 1727204190.56261: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.034) 0:00:14.239 ***** 41016 1727204190.56362: entering _queue_task() for managed-node1/include_tasks 41016 1727204190.56625: worker is 1 (out of 1 available) 41016 1727204190.56637: exiting _queue_task() for managed-node1/include_tasks 41016 1727204190.56764: done queuing things up, now waiting for results queue to drain 41016 1727204190.56765: waiting for pending results... 41016 1727204190.56995: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41016 1727204190.57092: in run() - task 028d2410-947f-12d5-0ec4-00000000001b 41016 1727204190.57182: variable 'ansible_search_path' from source: unknown 41016 1727204190.57186: variable 'ansible_search_path' from source: unknown 41016 1727204190.57188: calling self._execute() 41016 1727204190.57250: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204190.57263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204190.57279: variable 'omit' from source: magic vars 41016 1727204190.57652: variable 'ansible_distribution_major_version' from source: facts 41016 1727204190.57667: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204190.57677: _execute() done 41016 1727204190.57685: dumping result to json 41016 1727204190.57692: done dumping result, returning 41016 1727204190.57701: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-12d5-0ec4-00000000001b] 41016 1727204190.57711: sending task result for task 028d2410-947f-12d5-0ec4-00000000001b 41016 1727204190.57900: done sending task result for task 028d2410-947f-12d5-0ec4-00000000001b 41016 1727204190.57904: WORKER PROCESS EXITING 41016 1727204190.57943: no more pending results, returning what we have 41016 1727204190.57948: in VariableManager get_vars() 41016 1727204190.58106: Calling all_inventory to load vars for managed-node1 41016 1727204190.58109: Calling groups_inventory to load vars for managed-node1 41016 1727204190.58111: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204190.58119: Calling all_plugins_play to load vars for managed-node1 41016 1727204190.58122: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204190.58125: Calling groups_plugins_play to load vars for managed-node1 41016 1727204190.58388: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204190.58587: done with get_vars() 41016 1727204190.58595: variable 'ansible_search_path' from source: unknown 41016 1727204190.58596: variable 'ansible_search_path' from source: unknown 41016 1727204190.58639: we have included files to process 41016 1727204190.58640: generating all_blocks data 41016 1727204190.58642: done generating all_blocks data 41016 1727204190.58646: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41016 1727204190.58647: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41016 1727204190.58649: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41016 1727204190.59359: done processing included file 41016 1727204190.59361: iterating over new_blocks loaded from include file 41016 1727204190.59362: in VariableManager get_vars() 41016 1727204190.59392: done with get_vars() 41016 1727204190.59394: filtering new block on tags 41016 1727204190.59411: done filtering new block on tags 41016 1727204190.59414: in VariableManager get_vars() 41016 1727204190.59437: done with get_vars() 41016 1727204190.59439: filtering new block on tags 41016 1727204190.59459: done filtering new block on tags 41016 1727204190.59461: in VariableManager get_vars() 41016 1727204190.59489: done with get_vars() 41016 1727204190.59491: filtering new block on tags 41016 1727204190.59510: done filtering new block on tags 41016 1727204190.59512: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 41016 1727204190.59517: extending task lists for all hosts with included blocks 41016 1727204190.60393: done extending task lists 41016 1727204190.60394: done processing included files 41016 1727204190.60395: results queue empty 41016 1727204190.60396: checking for any_errors_fatal 41016 1727204190.60399: done checking for any_errors_fatal 41016 1727204190.60400: checking for max_fail_percentage 41016 1727204190.60401: done checking for max_fail_percentage 41016 1727204190.60402: checking to see if all hosts have failed and the running result is not ok 41016 1727204190.60403: done checking to see if all hosts have failed 41016 1727204190.60403: getting the remaining hosts for this loop 41016 1727204190.60405: done getting the remaining hosts for this loop 41016 1727204190.60407: getting the next task for host managed-node1 41016 1727204190.60411: done getting next task for host managed-node1 41016 1727204190.60414: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41016 1727204190.60417: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204190.60426: getting variables 41016 1727204190.60427: in VariableManager get_vars() 41016 1727204190.60443: Calling all_inventory to load vars for managed-node1 41016 1727204190.60445: Calling groups_inventory to load vars for managed-node1 41016 1727204190.60447: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204190.60452: Calling all_plugins_play to load vars for managed-node1 41016 1727204190.60460: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204190.60463: Calling groups_plugins_play to load vars for managed-node1 41016 1727204190.60631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204190.60830: done with get_vars() 41016 1727204190.60839: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.045) 0:00:14.285 ***** 41016 1727204190.60913: entering _queue_task() for managed-node1/setup 41016 1727204190.61385: worker is 1 (out of 1 available) 41016 1727204190.61394: exiting _queue_task() for managed-node1/setup 41016 1727204190.61403: done queuing things up, now waiting for results queue to drain 41016 1727204190.61404: waiting for pending results... 41016 1727204190.61537: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41016 1727204190.61741: in run() - task 028d2410-947f-12d5-0ec4-00000000049b 41016 1727204190.61744: variable 'ansible_search_path' from source: unknown 41016 1727204190.61747: variable 'ansible_search_path' from source: unknown 41016 1727204190.61754: calling self._execute() 41016 1727204190.61833: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204190.61849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204190.61862: variable 'omit' from source: magic vars 41016 1727204190.62240: variable 'ansible_distribution_major_version' from source: facts 41016 1727204190.62256: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204190.62480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204190.64785: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204190.64802: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204190.64844: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204190.64890: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204190.64922: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204190.65005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
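
A few entries back, tasks/main.yml:4 was queued as an include_tasks step and pulled tasks/set_facts.yml into the task list for managed-node1, gated on the distribution check that was evaluated just before it. A minimal sketch of such an include step, assuming the when condition sits on the include itself (the log only shows that the conditional was evaluated for this task; the exact layout is an assumption):

    # Hedged reconstruction of the include logged at roles/network/tasks/main.yml:4.
    # The file name and the condition come from the log; the argument layout is assumed.
    - name: Ensure ansible_facts used by role
      include_tasks: set_facts.yml
      when: ansible_distribution_major_version != '6'
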
41016 1727204190.65282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204190.65286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204190.65288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204190.65291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204190.65293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204190.65295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204190.65297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204190.65299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204190.65301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204190.65440: variable '__network_required_facts' from source: role '' defaults 41016 1727204190.65455: variable 'ansible_facts' from source: unknown 41016 1727204190.65557: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 41016 1727204190.65566: when evaluation is False, skipping this task 41016 1727204190.65573: _execute() done 41016 1727204190.65582: dumping result to json 41016 1727204190.65591: done dumping result, returning 41016 1727204190.65602: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-12d5-0ec4-00000000049b] 41016 1727204190.65611: sending task result for task 028d2410-947f-12d5-0ec4-00000000049b skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41016 1727204190.65792: no more pending results, returning what we have 41016 1727204190.65797: results queue empty 41016 1727204190.65798: checking for any_errors_fatal 41016 1727204190.65800: done checking for any_errors_fatal 41016 1727204190.65801: checking for max_fail_percentage 41016 1727204190.65803: done checking for max_fail_percentage 41016 1727204190.65804: checking to see if all hosts have failed and the running result is not ok 41016 1727204190.65804: done checking to see if all hosts have failed 41016 1727204190.65805: getting the remaining hosts for 
this loop 41016 1727204190.65807: done getting the remaining hosts for this loop 41016 1727204190.65811: getting the next task for host managed-node1 41016 1727204190.65821: done getting next task for host managed-node1 41016 1727204190.65826: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 41016 1727204190.65830: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204190.65844: getting variables 41016 1727204190.65848: in VariableManager get_vars() 41016 1727204190.65899: Calling all_inventory to load vars for managed-node1 41016 1727204190.65903: Calling groups_inventory to load vars for managed-node1 41016 1727204190.65906: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204190.65916: Calling all_plugins_play to load vars for managed-node1 41016 1727204190.65920: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204190.65923: Calling groups_plugins_play to load vars for managed-node1 41016 1727204190.66319: done sending task result for task 028d2410-947f-12d5-0ec4-00000000049b 41016 1727204190.66323: WORKER PROCESS EXITING 41016 1727204190.66348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204190.66719: done with get_vars() 41016 1727204190.66731: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.059) 0:00:14.344 ***** 41016 1727204190.66837: entering _queue_task() for managed-node1/stat 41016 1727204190.67219: worker is 1 (out of 1 available) 41016 1727204190.67231: exiting _queue_task() for managed-node1/stat 41016 1727204190.67241: done queuing things up, now waiting for results queue to drain 41016 1727204190.67242: waiting for pending results... 
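
The skipping result above means the role already has every fact it needs: __network_required_facts (from the role defaults) minus the keys present in ansible_facts was empty, so the guarded setup call at set_facts.yml:3 never ran, and its result was censored because no_log was set. A hedged sketch of what such a guard looks like; the when expression and the no_log behaviour are taken from the log, while the gather_subset value is an assumption:

    # Sketch only - not the role's verbatim task file.
    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min   # assumption: gather a minimal subset when facts are missing
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true           # matches the censored result seen in the log
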
41016 1727204190.67439: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 41016 1727204190.67599: in run() - task 028d2410-947f-12d5-0ec4-00000000049d 41016 1727204190.67626: variable 'ansible_search_path' from source: unknown 41016 1727204190.67635: variable 'ansible_search_path' from source: unknown 41016 1727204190.67679: calling self._execute() 41016 1727204190.67763: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204190.67774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204190.67790: variable 'omit' from source: magic vars 41016 1727204190.68148: variable 'ansible_distribution_major_version' from source: facts 41016 1727204190.68167: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204190.68329: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204190.68615: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204190.68701: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204190.68722: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204190.68763: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204190.68883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204190.68898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204190.68935: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204190.68991: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204190.69065: variable '__network_is_ostree' from source: set_fact 41016 1727204190.69079: Evaluated conditional (not __network_is_ostree is defined): False 41016 1727204190.69088: when evaluation is False, skipping this task 41016 1727204190.69099: _execute() done 41016 1727204190.69136: dumping result to json 41016 1727204190.69139: done dumping result, returning 41016 1727204190.69142: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-12d5-0ec4-00000000049d] 41016 1727204190.69144: sending task result for task 028d2410-947f-12d5-0ec4-00000000049d skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41016 1727204190.69405: no more pending results, returning what we have 41016 1727204190.69410: results queue empty 41016 1727204190.69412: checking for any_errors_fatal 41016 1727204190.69420: done checking for any_errors_fatal 41016 1727204190.69420: checking for max_fail_percentage 41016 1727204190.69422: done checking for max_fail_percentage 41016 1727204190.69423: checking to see if all hosts have 
failed and the running result is not ok 41016 1727204190.69424: done checking to see if all hosts have failed 41016 1727204190.69425: getting the remaining hosts for this loop 41016 1727204190.69427: done getting the remaining hosts for this loop 41016 1727204190.69430: getting the next task for host managed-node1 41016 1727204190.69438: done getting next task for host managed-node1 41016 1727204190.69442: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41016 1727204190.69446: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204190.69578: getting variables 41016 1727204190.69579: in VariableManager get_vars() 41016 1727204190.69618: Calling all_inventory to load vars for managed-node1 41016 1727204190.69621: Calling groups_inventory to load vars for managed-node1 41016 1727204190.69623: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204190.69632: Calling all_plugins_play to load vars for managed-node1 41016 1727204190.69634: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204190.69638: Calling groups_plugins_play to load vars for managed-node1 41016 1727204190.69880: done sending task result for task 028d2410-947f-12d5-0ec4-00000000049d 41016 1727204190.69884: WORKER PROCESS EXITING 41016 1727204190.69914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204190.70134: done with get_vars() 41016 1727204190.70146: done getting variables 41016 1727204190.70204: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.034) 0:00:14.378 ***** 41016 1727204190.70244: entering _queue_task() for managed-node1/set_fact 41016 1727204190.70526: worker is 1 (out of 1 available) 41016 1727204190.70539: exiting _queue_task() for managed-node1/set_fact 41016 1727204190.70662: done queuing things up, now waiting for results queue to drain 41016 1727204190.70664: waiting for pending results... 
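
The "Check if system is ostree" step at set_facts.yml:12 was just skipped because __network_is_ostree already exists from an earlier set_fact, and the companion "Set flag to indicate system is ostree" step at set_facts.yml:17 is queued next with the same guard. A hedged sketch of that stat/set_fact pair; the task names, module types, and the guard come from the log, while the stat path and the registered variable name are assumptions:

    # Sketch only - the path and register name are illustrative.
    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted          # assumed marker file for ostree-based systems
      register: __ostree_booted_stat
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined
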
41016 1727204190.70835: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41016 1727204190.70995: in run() - task 028d2410-947f-12d5-0ec4-00000000049e 41016 1727204190.71018: variable 'ansible_search_path' from source: unknown 41016 1727204190.71026: variable 'ansible_search_path' from source: unknown 41016 1727204190.71068: calling self._execute() 41016 1727204190.71159: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204190.71171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204190.71186: variable 'omit' from source: magic vars 41016 1727204190.71565: variable 'ansible_distribution_major_version' from source: facts 41016 1727204190.71583: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204190.71761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204190.72129: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204190.72179: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204190.72297: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204190.72301: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204190.72348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204190.72381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204190.72419: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204190.72449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204190.72542: variable '__network_is_ostree' from source: set_fact 41016 1727204190.72555: Evaluated conditional (not __network_is_ostree is defined): False 41016 1727204190.72563: when evaluation is False, skipping this task 41016 1727204190.72571: _execute() done 41016 1727204190.72580: dumping result to json 41016 1727204190.72594: done dumping result, returning 41016 1727204190.72601: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-12d5-0ec4-00000000049e] 41016 1727204190.72604: sending task result for task 028d2410-947f-12d5-0ec4-00000000049e 41016 1727204190.72728: done sending task result for task 028d2410-947f-12d5-0ec4-00000000049e 41016 1727204190.72731: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41016 1727204190.72774: no more pending results, returning what we have 41016 1727204190.72782: results queue empty 41016 1727204190.72786: checking for any_errors_fatal 41016 1727204190.72794: done checking for any_errors_fatal 41016 
1727204190.72795: checking for max_fail_percentage 41016 1727204190.72796: done checking for max_fail_percentage 41016 1727204190.72797: checking to see if all hosts have failed and the running result is not ok 41016 1727204190.72798: done checking to see if all hosts have failed 41016 1727204190.72799: getting the remaining hosts for this loop 41016 1727204190.72800: done getting the remaining hosts for this loop 41016 1727204190.72803: getting the next task for host managed-node1 41016 1727204190.72815: done getting next task for host managed-node1 41016 1727204190.72819: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 41016 1727204190.72822: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204190.72838: getting variables 41016 1727204190.72839: in VariableManager get_vars() 41016 1727204190.72880: Calling all_inventory to load vars for managed-node1 41016 1727204190.72883: Calling groups_inventory to load vars for managed-node1 41016 1727204190.72885: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204190.72897: Calling all_plugins_play to load vars for managed-node1 41016 1727204190.72900: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204190.72903: Calling groups_plugins_play to load vars for managed-node1 41016 1727204190.73071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204190.73197: done with get_vars() 41016 1727204190.73205: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.030) 0:00:14.408 ***** 41016 1727204190.73272: entering _queue_task() for managed-node1/service_facts 41016 1727204190.73274: Creating lock for service_facts 41016 1727204190.73498: worker is 1 (out of 1 available) 41016 1727204190.73513: exiting _queue_task() for managed-node1/service_facts 41016 1727204190.73524: done queuing things up, now waiting for results queue to drain 41016 1727204190.73525: waiting for pending results... 
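
With the ostree steps skipped, the role moves on to set_facts.yml:21, which runs the service_facts module on managed-node1; the SSH round-trips and the returned ansible_facts.services map appear below. A hedged sketch of that task plus one illustrative way such a services map is commonly consumed; the module name comes from the log, while the follow-up set_fact is an assumption added for illustration, not the role's actual logic:

    # Sketch only - the second task is illustrative, not taken from the role.
    - name: Check which services are running
      service_facts:

    - name: Example - record whether NetworkManager is currently running
      set_fact:
        __nm_running: "{{ 'NetworkManager.service' in ansible_facts.services
                          and ansible_facts.services['NetworkManager.service'].state == 'running' }}"
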
41016 1727204190.73697: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 41016 1727204190.73788: in run() - task 028d2410-947f-12d5-0ec4-0000000004a0 41016 1727204190.73799: variable 'ansible_search_path' from source: unknown 41016 1727204190.73803: variable 'ansible_search_path' from source: unknown 41016 1727204190.73830: calling self._execute() 41016 1727204190.73894: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204190.73898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204190.73908: variable 'omit' from source: magic vars 41016 1727204190.74170: variable 'ansible_distribution_major_version' from source: facts 41016 1727204190.74181: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204190.74190: variable 'omit' from source: magic vars 41016 1727204190.74236: variable 'omit' from source: magic vars 41016 1727204190.74259: variable 'omit' from source: magic vars 41016 1727204190.74291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204190.74322: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204190.74336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204190.74354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204190.74396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204190.74403: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204190.74407: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204190.74413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204190.74580: Set connection var ansible_shell_executable to /bin/sh 41016 1727204190.74583: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204190.74585: Set connection var ansible_shell_type to sh 41016 1727204190.74587: Set connection var ansible_timeout to 10 41016 1727204190.74590: Set connection var ansible_pipelining to False 41016 1727204190.74592: Set connection var ansible_connection to ssh 41016 1727204190.74594: variable 'ansible_shell_executable' from source: unknown 41016 1727204190.74596: variable 'ansible_connection' from source: unknown 41016 1727204190.74598: variable 'ansible_module_compression' from source: unknown 41016 1727204190.74600: variable 'ansible_shell_type' from source: unknown 41016 1727204190.74602: variable 'ansible_shell_executable' from source: unknown 41016 1727204190.74604: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204190.74606: variable 'ansible_pipelining' from source: unknown 41016 1727204190.74607: variable 'ansible_timeout' from source: unknown 41016 1727204190.74609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204190.74801: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204190.74818: variable 'omit' from source: magic vars 41016 
1727204190.74829: starting attempt loop 41016 1727204190.74836: running the handler 41016 1727204190.74855: _low_level_execute_command(): starting 41016 1727204190.74981: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204190.75514: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204190.75532: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204190.75550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204190.75617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204190.75630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204190.75726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204190.77497: stdout chunk (state=3): >>>/root <<< 41016 1727204190.77657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204190.77661: stdout chunk (state=3): >>><<< 41016 1727204190.77663: stderr chunk (state=3): >>><<< 41016 1727204190.77683: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204190.77702: _low_level_execute_command(): starting 41016 1727204190.77791: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204190.7768931-42512-223024527952659 `" && echo 
ansible-tmp-1727204190.7768931-42512-223024527952659="` echo /root/.ansible/tmp/ansible-tmp-1727204190.7768931-42512-223024527952659 `" ) && sleep 0' 41016 1727204190.78367: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204190.78390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204190.78405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204190.78527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204190.78560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204190.78670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204190.80765: stdout chunk (state=3): >>>ansible-tmp-1727204190.7768931-42512-223024527952659=/root/.ansible/tmp/ansible-tmp-1727204190.7768931-42512-223024527952659 <<< 41016 1727204190.80944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204190.80948: stdout chunk (state=3): >>><<< 41016 1727204190.80950: stderr chunk (state=3): >>><<< 41016 1727204190.80964: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204190.7768931-42512-223024527952659=/root/.ansible/tmp/ansible-tmp-1727204190.7768931-42512-223024527952659 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204190.81049: variable 'ansible_module_compression' from source: unknown 41016 1727204190.81157: ANSIBALLZ: Using lock for service_facts 41016 1727204190.81162: ANSIBALLZ: Acquiring 
lock 41016 1727204190.81164: ANSIBALLZ: Lock acquired: 140580607103088 41016 1727204190.81166: ANSIBALLZ: Creating module 41016 1727204190.90859: ANSIBALLZ: Writing module into payload 41016 1727204190.90930: ANSIBALLZ: Writing module 41016 1727204190.90957: ANSIBALLZ: Renaming module 41016 1727204190.90961: ANSIBALLZ: Done creating module 41016 1727204190.90980: variable 'ansible_facts' from source: unknown 41016 1727204190.91030: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204190.7768931-42512-223024527952659/AnsiballZ_service_facts.py 41016 1727204190.91139: Sending initial data 41016 1727204190.91142: Sent initial data (162 bytes) 41016 1727204190.91615: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204190.91621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204190.91624: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 41016 1727204190.91626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204190.91628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204190.91682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204190.91685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204190.91687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204190.91774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204190.93510: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204190.93588: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204190.93664: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpkl2brtko /root/.ansible/tmp/ansible-tmp-1727204190.7768931-42512-223024527952659/AnsiballZ_service_facts.py <<< 41016 1727204190.93667: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204190.7768931-42512-223024527952659/AnsiballZ_service_facts.py" <<< 41016 1727204190.93735: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpkl2brtko" to remote "/root/.ansible/tmp/ansible-tmp-1727204190.7768931-42512-223024527952659/AnsiballZ_service_facts.py" <<< 41016 1727204190.93739: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204190.7768931-42512-223024527952659/AnsiballZ_service_facts.py" <<< 41016 1727204190.94460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204190.94506: stderr chunk (state=3): >>><<< 41016 1727204190.94513: stdout chunk (state=3): >>><<< 41016 1727204190.94549: done transferring module to remote 41016 1727204190.94559: _low_level_execute_command(): starting 41016 1727204190.94564: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204190.7768931-42512-223024527952659/ /root/.ansible/tmp/ansible-tmp-1727204190.7768931-42512-223024527952659/AnsiballZ_service_facts.py && sleep 0' 41016 1727204190.95027: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204190.95030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204190.95033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204190.95035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204190.95037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204190.95039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204190.95081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204190.95103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204190.95174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204190.97102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204190.97130: stderr chunk (state=3): >>><<< 41016 1727204190.97134: stdout chunk (state=3): >>><<< 41016 1727204190.97150: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204190.97153: _low_level_execute_command(): starting 41016 1727204190.97158: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204190.7768931-42512-223024527952659/AnsiballZ_service_facts.py && sleep 0' 41016 1727204190.97580: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204190.97606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204190.97611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204190.97614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204190.97665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204190.97672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204190.97674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204190.97755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204192.74212: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 41016 1727204192.74264: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": 
"alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 41016 1727204192.76092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204192.76182: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. <<< 41016 1727204192.76187: stdout chunk (state=3): >>><<< 41016 1727204192.76191: stderr chunk (state=3): >>><<< 41016 1727204192.76195: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": 
"systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 41016 1727204192.76852: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204190.7768931-42512-223024527952659/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204192.76879: _low_level_execute_command(): starting 41016 1727204192.76890: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204190.7768931-42512-223024527952659/ > /dev/null 2>&1 && sleep 0' 41016 1727204192.77542: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204192.77556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204192.77570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204192.77588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204192.77604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204192.77618: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204192.77730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204192.77733: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204192.77785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204192.77862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204192.79933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204192.79937: stdout chunk (state=3): >>><<< 41016 1727204192.79946: stderr chunk (state=3): >>><<< 41016 1727204192.79961: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204192.79968: handler run complete 41016 1727204192.80192: variable 'ansible_facts' from source: unknown 41016 1727204192.80353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204192.80849: variable 'ansible_facts' from source: unknown 41016 1727204192.82183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204192.82403: attempt loop complete, returning result 41016 1727204192.82418: _execute() done 41016 1727204192.82424: dumping result to json 41016 1727204192.82494: done dumping result, returning 41016 1727204192.82680: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-12d5-0ec4-0000000004a0] 41016 1727204192.82683: sending task result for task 028d2410-947f-12d5-0ec4-0000000004a0 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41016 1727204192.83381: no more pending results, returning what we have 41016 1727204192.83384: results queue empty 41016 1727204192.83385: checking for any_errors_fatal 41016 1727204192.83390: done checking for any_errors_fatal 41016 1727204192.83390: checking for max_fail_percentage 41016 1727204192.83392: done checking for max_fail_percentage 41016 1727204192.83393: checking to see if all hosts have failed and the running result is not ok 41016 1727204192.83394: done checking to see if all hosts have failed 41016 1727204192.83394: getting the remaining hosts for this loop 41016 1727204192.83396: done getting the remaining hosts for this loop 41016 1727204192.83399: getting the next task for host managed-node1 41016 1727204192.83405: done getting next task for host managed-node1 41016 1727204192.83479: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 41016 1727204192.83483: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204192.83492: getting variables 41016 1727204192.83494: in VariableManager get_vars() 41016 1727204192.83629: Calling all_inventory to load vars for managed-node1 41016 1727204192.83632: Calling groups_inventory to load vars for managed-node1 41016 1727204192.83634: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204192.83640: done sending task result for task 028d2410-947f-12d5-0ec4-0000000004a0 41016 1727204192.83643: WORKER PROCESS EXITING 41016 1727204192.83651: Calling all_plugins_play to load vars for managed-node1 41016 1727204192.83653: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204192.83656: Calling groups_plugins_play to load vars for managed-node1 41016 1727204192.84051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204192.84565: done with get_vars() 41016 1727204192.84581: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:56:32 -0400 (0:00:02.114) 0:00:16.522 ***** 41016 1727204192.84683: entering _queue_task() for managed-node1/package_facts 41016 1727204192.84685: Creating lock for package_facts 41016 1727204192.84993: worker is 1 (out of 1 available) 41016 1727204192.85006: exiting _queue_task() for managed-node1/package_facts 41016 1727204192.85021: done queuing things up, now waiting for results queue to drain 41016 1727204192.85022: waiting for pending results... 
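The service_facts payload echoed in the stdout chunks above follows the shape {"ansible_facts": {"services": {"<unit>.service": {"name", "state", "status", "source"}}}}. A minimal sketch of inspecting such a dump offline with plain Python, assuming the JSON has been saved to a local file (the filename service_facts.json and the specific checks are illustrative only, not part of this run):

    import json

    # Load a captured service_facts payload shaped like the module output
    # echoed above (the path is hypothetical).
    with open("service_facts.json") as fh:
        payload = json.load(fh)

    services = payload["ansible_facts"]["services"]

    # Units the module reported as currently running on the managed node.
    running = sorted(name for name, svc in services.items()
                     if svc["state"] == "running")
    print("running units:", ", ".join(running))

    # Spot-check a single unit, e.g. whether NetworkManager.service is enabled.
    nm = services.get("NetworkManager.service", {})
    print("NetworkManager.service enabled:", nm.get("status") == "enabled")

Note that in the run above the raw payload is only visible at this debug verbosity: the task result a few lines earlier is reported as "censored" because the module was invoked with '_ansible_no_log': True, so the services dictionary is hidden from the normal playbook output.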
41016 1727204192.85306: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 41016 1727204192.85455: in run() - task 028d2410-947f-12d5-0ec4-0000000004a1 41016 1727204192.85482: variable 'ansible_search_path' from source: unknown 41016 1727204192.85491: variable 'ansible_search_path' from source: unknown 41016 1727204192.85541: calling self._execute() 41016 1727204192.85643: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204192.85655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204192.85668: variable 'omit' from source: magic vars 41016 1727204192.86121: variable 'ansible_distribution_major_version' from source: facts 41016 1727204192.86139: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204192.86155: variable 'omit' from source: magic vars 41016 1727204192.86242: variable 'omit' from source: magic vars 41016 1727204192.86290: variable 'omit' from source: magic vars 41016 1727204192.86339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204192.86385: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204192.86414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204192.86438: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204192.86456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204192.86496: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204192.86506: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204192.86517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204192.86627: Set connection var ansible_shell_executable to /bin/sh 41016 1727204192.86639: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204192.86650: Set connection var ansible_shell_type to sh 41016 1727204192.86659: Set connection var ansible_timeout to 10 41016 1727204192.86670: Set connection var ansible_pipelining to False 41016 1727204192.86684: Set connection var ansible_connection to ssh 41016 1727204192.86716: variable 'ansible_shell_executable' from source: unknown 41016 1727204192.86725: variable 'ansible_connection' from source: unknown 41016 1727204192.86733: variable 'ansible_module_compression' from source: unknown 41016 1727204192.86739: variable 'ansible_shell_type' from source: unknown 41016 1727204192.86747: variable 'ansible_shell_executable' from source: unknown 41016 1727204192.86754: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204192.86762: variable 'ansible_pipelining' from source: unknown 41016 1727204192.86769: variable 'ansible_timeout' from source: unknown 41016 1727204192.86779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204192.86985: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204192.87024: variable 'omit' from source: magic vars 41016 
1727204192.87027: starting attempt loop 41016 1727204192.87030: running the handler 41016 1727204192.87035: _low_level_execute_command(): starting 41016 1727204192.87049: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204192.87788: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204192.87898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204192.87918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204192.87941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204192.88049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204192.89858: stdout chunk (state=3): >>>/root <<< 41016 1727204192.90007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204192.90027: stdout chunk (state=3): >>><<< 41016 1727204192.90048: stderr chunk (state=3): >>><<< 41016 1727204192.90074: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204192.90185: _low_level_execute_command(): starting 41016 1727204192.90189: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204192.9008443-42557-263651355735761 `" && echo ansible-tmp-1727204192.9008443-42557-263651355735761="` echo 
/root/.ansible/tmp/ansible-tmp-1727204192.9008443-42557-263651355735761 `" ) && sleep 0' 41016 1727204192.90793: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204192.90808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204192.90849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204192.90853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204192.90949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204192.91004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204192.91096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204192.93179: stdout chunk (state=3): >>>ansible-tmp-1727204192.9008443-42557-263651355735761=/root/.ansible/tmp/ansible-tmp-1727204192.9008443-42557-263651355735761 <<< 41016 1727204192.93372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204192.93389: stderr chunk (state=3): >>><<< 41016 1727204192.93398: stdout chunk (state=3): >>><<< 41016 1727204192.93428: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204192.9008443-42557-263651355735761=/root/.ansible/tmp/ansible-tmp-1727204192.9008443-42557-263651355735761 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204192.93488: variable 'ansible_module_compression' from source: unknown 41016 1727204192.93582: ANSIBALLZ: Using lock for package_facts 41016 1727204192.93585: ANSIBALLZ: Acquiring lock 
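The ANSIBALLZ records around this point cover Ansible assembling the self-contained AnsiballZ_package_facts.py that is then transferred and executed on the target: the module source and the module_utils code it imports are bundled into a single Python file (the ansible_module_compression connection variable above was set to ZIP_DEFLATED). As a conceptual sketch of that packaging idea only, not Ansible's real implementation (every name and the launcher script below are invented for illustration):

# Conceptual sketch: bundle a module plus helper files into one self-extracting
# Python launcher that can be copied to the target and run with its interpreter.
import base64
import io
import zipfile

def build_payload(module_source: str, utils: dict[str, str]) -> str:
    """Return a single .py launcher embedding a zip of the module and helpers."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("module.py", module_source)
        for relpath, source in utils.items():
            zf.writestr(relpath, source)
    blob = base64.b64encode(buf.getvalue()).decode("ascii")
    # The launcher decodes the zip, unpacks it to a temp dir, and runs module.py.
    return (
        "import base64, io, os, runpy, tempfile, zipfile\n"
        f"blob = {blob!r}\n"
        "data = io.BytesIO(base64.b64decode(blob))\n"
        "tmp = tempfile.mkdtemp()\n"
        "zipfile.ZipFile(data).extractall(tmp)\n"
        "runpy.run_path(os.path.join(tmp, 'module.py'), run_name='__main__')\n"
    )

# Example use: write the launcher out, ship it, and run it with the remote python.
# print(build_payload("print('hello from the target')", {}))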
41016 1727204192.93588: ANSIBALLZ: Lock acquired: 140580606720288 41016 1727204192.93590: ANSIBALLZ: Creating module 41016 1727204193.34474: ANSIBALLZ: Writing module into payload 41016 1727204193.34535: ANSIBALLZ: Writing module 41016 1727204193.34627: ANSIBALLZ: Renaming module 41016 1727204193.34781: ANSIBALLZ: Done creating module 41016 1727204193.34784: variable 'ansible_facts' from source: unknown 41016 1727204193.35164: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204192.9008443-42557-263651355735761/AnsiballZ_package_facts.py 41016 1727204193.35588: Sending initial data 41016 1727204193.35591: Sent initial data (162 bytes) 41016 1727204193.36960: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204193.37119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204193.37290: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204193.37418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204193.39192: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204193.39268: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204193.39380: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmp_169ggri /root/.ansible/tmp/ansible-tmp-1727204192.9008443-42557-263651355735761/AnsiballZ_package_facts.py <<< 41016 1727204193.39410: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204192.9008443-42557-263651355735761/AnsiballZ_package_facts.py" <<< 41016 1727204193.39490: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmp_169ggri" to remote "/root/.ansible/tmp/ansible-tmp-1727204192.9008443-42557-263651355735761/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204192.9008443-42557-263651355735761/AnsiballZ_package_facts.py" <<< 41016 1727204193.42280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204193.42294: stdout chunk (state=3): >>><<< 41016 1727204193.42469: stderr chunk (state=3): >>><<< 41016 1727204193.42472: done transferring module to remote 41016 1727204193.42477: _low_level_execute_command(): starting 41016 1727204193.42482: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204192.9008443-42557-263651355735761/ /root/.ansible/tmp/ansible-tmp-1727204192.9008443-42557-263651355735761/AnsiballZ_package_facts.py && sleep 0' 41016 1727204193.43700: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204193.43720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204193.43772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204193.43799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204193.43848: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204193.43971: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204193.44145: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204193.44197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204193.44338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204193.46332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204193.46341: stdout chunk (state=3): >>><<< 41016 1727204193.46350: stderr chunk (state=3): >>><<< 41016 1727204193.46367: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204193.46374: _low_level_execute_command(): starting 41016 1727204193.46386: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204192.9008443-42557-263651355735761/AnsiballZ_package_facts.py && sleep 0' 41016 1727204193.47643: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204193.47661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204193.47673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204193.47695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204193.47714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204193.47725: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204193.47869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204193.48112: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204193.48200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204193.95588: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": 
"hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 41016 1727204193.95610: stdout chunk (state=3): >>>: [{"name": 
"gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": 
"xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": 
"4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": 
"procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", 
"version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": 
[{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": 
[{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el1<<< 41016 1727204193.95790: stdout chunk (state=3): >>>0", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", 
"release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": 
[{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 41016 1727204193.97834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204193.97838: stdout chunk (state=3): >>><<< 41016 1727204193.97841: stderr chunk (state=3): >>><<< 41016 1727204193.97966: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": 
[{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": 
"3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", 
"release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", 
"version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 41016 1727204194.02142: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204192.9008443-42557-263651355735761/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204194.02382: _low_level_execute_command(): starting 41016 1727204194.02386: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204192.9008443-42557-263651355735761/ > /dev/null 2>&1 && sleep 0' 41016 1727204194.03653: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204194.03657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204194.03659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204194.03662: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204194.03665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204194.03847: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204194.03851: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204194.03893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204194.04084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204194.06074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204194.06216: stderr chunk (state=3): >>><<< 41016 1727204194.06219: stdout chunk (state=3): >>><<< 41016 1727204194.06221: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204194.06223: handler run complete 41016 1727204194.07245: variable 'ansible_facts' from source: unknown 41016 1727204194.07683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204194.09895: variable 'ansible_facts' from source: unknown 41016 1727204194.10383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204194.11151: attempt loop complete, returning result 41016 1727204194.11154: _execute() done 41016 1727204194.11156: dumping result to json 41016 1727204194.11331: done dumping result, returning 41016 1727204194.11344: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-12d5-0ec4-0000000004a1] 41016 1727204194.11351: sending task result for task 028d2410-947f-12d5-0ec4-0000000004a1 41016 1727204194.15365: done sending task result for task 028d2410-947f-12d5-0ec4-0000000004a1 41016 1727204194.15369: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41016 1727204194.15471: no more pending results, returning what we have 41016 1727204194.15474: results queue empty 41016 1727204194.15480: checking for any_errors_fatal 41016 1727204194.15485: done checking for any_errors_fatal 41016 1727204194.15486: checking for max_fail_percentage 41016 1727204194.15487: done checking for max_fail_percentage 41016 1727204194.15488: checking to see if all hosts have failed and the running result is not ok 41016 1727204194.15489: done checking to see if all hosts have failed 41016 1727204194.15490: getting the remaining hosts for this loop 41016 1727204194.15491: done getting the remaining hosts for this loop 41016 1727204194.15494: getting the next task for host managed-node1 41016 1727204194.15501: done getting next task for host managed-node1 41016 1727204194.15505: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 41016 1727204194.15507: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204194.15520: getting variables 41016 1727204194.15521: in VariableManager get_vars() 41016 1727204194.15556: Calling all_inventory to load vars for managed-node1 41016 1727204194.15559: Calling groups_inventory to load vars for managed-node1 41016 1727204194.15561: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204194.15570: Calling all_plugins_play to load vars for managed-node1 41016 1727204194.15573: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204194.15703: Calling groups_plugins_play to load vars for managed-node1 41016 1727204194.18151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204194.21414: done with get_vars() 41016 1727204194.21445: done getting variables 41016 1727204194.21625: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:56:34 -0400 (0:00:01.369) 0:00:17.892 ***** 41016 1727204194.21665: entering _queue_task() for managed-node1/debug 41016 1727204194.22368: worker is 1 (out of 1 available) 41016 1727204194.22517: exiting _queue_task() for managed-node1/debug 41016 1727204194.22529: done queuing things up, now waiting for results queue to drain 41016 1727204194.22530: waiting for pending results... 
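The trace above queues the role's "Print network provider" debug task (roles/network/tasks/main.yml:7), and the result further below is the message "Using network provider: nm". The task file itself is not part of this log; a minimal sketch of a debug task that would produce that output, assuming network_provider was set by the earlier set_fact the trace mentions, is:

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"  # hypothetical reconstruction; the real task body is not shown in this log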
41016 1727204194.22970: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 41016 1727204194.23141: in run() - task 028d2410-947f-12d5-0ec4-00000000001c 41016 1727204194.23243: variable 'ansible_search_path' from source: unknown 41016 1727204194.23246: variable 'ansible_search_path' from source: unknown 41016 1727204194.23355: calling self._execute() 41016 1727204194.23453: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204194.23483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204194.23591: variable 'omit' from source: magic vars 41016 1727204194.24327: variable 'ansible_distribution_major_version' from source: facts 41016 1727204194.24351: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204194.24560: variable 'omit' from source: magic vars 41016 1727204194.24563: variable 'omit' from source: magic vars 41016 1727204194.24727: variable 'network_provider' from source: set_fact 41016 1727204194.24751: variable 'omit' from source: magic vars 41016 1727204194.24984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204194.24990: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204194.24992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204194.25006: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204194.25023: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204194.25082: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204194.25147: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204194.25158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204194.25481: Set connection var ansible_shell_executable to /bin/sh 41016 1727204194.25484: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204194.25487: Set connection var ansible_shell_type to sh 41016 1727204194.25489: Set connection var ansible_timeout to 10 41016 1727204194.25491: Set connection var ansible_pipelining to False 41016 1727204194.25493: Set connection var ansible_connection to ssh 41016 1727204194.25495: variable 'ansible_shell_executable' from source: unknown 41016 1727204194.25497: variable 'ansible_connection' from source: unknown 41016 1727204194.25499: variable 'ansible_module_compression' from source: unknown 41016 1727204194.25502: variable 'ansible_shell_type' from source: unknown 41016 1727204194.25504: variable 'ansible_shell_executable' from source: unknown 41016 1727204194.25506: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204194.25508: variable 'ansible_pipelining' from source: unknown 41016 1727204194.25561: variable 'ansible_timeout' from source: unknown 41016 1727204194.25596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204194.25872: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 41016 1727204194.25959: variable 'omit' from source: magic vars 41016 1727204194.25964: starting attempt loop 41016 1727204194.25968: running the handler 41016 1727204194.26017: handler run complete 41016 1727204194.26031: attempt loop complete, returning result 41016 1727204194.26034: _execute() done 41016 1727204194.26037: dumping result to json 41016 1727204194.26039: done dumping result, returning 41016 1727204194.26048: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-12d5-0ec4-00000000001c] 41016 1727204194.26050: sending task result for task 028d2410-947f-12d5-0ec4-00000000001c ok: [managed-node1] => {} MSG: Using network provider: nm 41016 1727204194.26248: no more pending results, returning what we have 41016 1727204194.26252: results queue empty 41016 1727204194.26253: checking for any_errors_fatal 41016 1727204194.26262: done checking for any_errors_fatal 41016 1727204194.26263: checking for max_fail_percentage 41016 1727204194.26264: done checking for max_fail_percentage 41016 1727204194.26265: checking to see if all hosts have failed and the running result is not ok 41016 1727204194.26266: done checking to see if all hosts have failed 41016 1727204194.26267: getting the remaining hosts for this loop 41016 1727204194.26268: done getting the remaining hosts for this loop 41016 1727204194.26271: getting the next task for host managed-node1 41016 1727204194.26280: done getting next task for host managed-node1 41016 1727204194.26284: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41016 1727204194.26287: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204194.26302: getting variables 41016 1727204194.26305: in VariableManager get_vars() 41016 1727204194.26351: Calling all_inventory to load vars for managed-node1 41016 1727204194.26354: Calling groups_inventory to load vars for managed-node1 41016 1727204194.26357: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204194.26365: Calling all_plugins_play to load vars for managed-node1 41016 1727204194.26368: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204194.26371: Calling groups_plugins_play to load vars for managed-node1 41016 1727204194.27191: done sending task result for task 028d2410-947f-12d5-0ec4-00000000001c 41016 1727204194.27194: WORKER PROCESS EXITING 41016 1727204194.29382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204194.32753: done with get_vars() 41016 1727204194.32891: done getting variables 41016 1727204194.33072: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:56:34 -0400 (0:00:00.114) 0:00:18.007 ***** 41016 1727204194.33112: entering _queue_task() for managed-node1/fail 41016 1727204194.33817: worker is 1 (out of 1 available) 41016 1727204194.33830: exiting _queue_task() for managed-node1/fail 41016 1727204194.33841: done queuing things up, now waiting for results queue to drain 41016 1727204194.33842: waiting for pending results... 
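The entries that follow show the guard task at roles/network/tasks/main.yml:11 being skipped with false_condition "network_state != {}". A hedged sketch of a fail task of that shape (only the condition appears in this log; the message wording is an assumption) is:

- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider  # assumed wording, not taken from this log
  when: network_state != {}  # network_state comes from the role defaults as an empty dict in this run

Because network_state is still the empty role default here, the conditional evaluates to False and the result is reported as "skipping", exactly as in the block below.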
41016 1727204194.34342: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41016 1727204194.34660: in run() - task 028d2410-947f-12d5-0ec4-00000000001d 41016 1727204194.34769: variable 'ansible_search_path' from source: unknown 41016 1727204194.34773: variable 'ansible_search_path' from source: unknown 41016 1727204194.34778: calling self._execute() 41016 1727204194.34917: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204194.34928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204194.34995: variable 'omit' from source: magic vars 41016 1727204194.35679: variable 'ansible_distribution_major_version' from source: facts 41016 1727204194.35757: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204194.36083: variable 'network_state' from source: role '' defaults 41016 1727204194.36088: Evaluated conditional (network_state != {}): False 41016 1727204194.36093: when evaluation is False, skipping this task 41016 1727204194.36096: _execute() done 41016 1727204194.36099: dumping result to json 41016 1727204194.36102: done dumping result, returning 41016 1727204194.36108: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-12d5-0ec4-00000000001d] 41016 1727204194.36114: sending task result for task 028d2410-947f-12d5-0ec4-00000000001d 41016 1727204194.36191: done sending task result for task 028d2410-947f-12d5-0ec4-00000000001d 41016 1727204194.36195: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41016 1727204194.36249: no more pending results, returning what we have 41016 1727204194.36254: results queue empty 41016 1727204194.36256: checking for any_errors_fatal 41016 1727204194.36264: done checking for any_errors_fatal 41016 1727204194.36264: checking for max_fail_percentage 41016 1727204194.36266: done checking for max_fail_percentage 41016 1727204194.36267: checking to see if all hosts have failed and the running result is not ok 41016 1727204194.36268: done checking to see if all hosts have failed 41016 1727204194.36269: getting the remaining hosts for this loop 41016 1727204194.36270: done getting the remaining hosts for this loop 41016 1727204194.36274: getting the next task for host managed-node1 41016 1727204194.36283: done getting next task for host managed-node1 41016 1727204194.36287: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41016 1727204194.36290: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204194.36305: getting variables 41016 1727204194.36307: in VariableManager get_vars() 41016 1727204194.36350: Calling all_inventory to load vars for managed-node1 41016 1727204194.36353: Calling groups_inventory to load vars for managed-node1 41016 1727204194.36355: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204194.36365: Calling all_plugins_play to load vars for managed-node1 41016 1727204194.36367: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204194.36370: Calling groups_plugins_play to load vars for managed-node1 41016 1727204194.39802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204194.43658: done with get_vars() 41016 1727204194.43706: done getting variables 41016 1727204194.43766: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:56:34 -0400 (0:00:00.106) 0:00:18.114 ***** 41016 1727204194.43808: entering _queue_task() for managed-node1/fail 41016 1727204194.44149: worker is 1 (out of 1 available) 41016 1727204194.44164: exiting _queue_task() for managed-node1/fail 41016 1727204194.44180: done queuing things up, now waiting for results queue to drain 41016 1727204194.44182: waiting for pending results... 
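Both network_state guards skip for the same reason: network_state is the role default ({}). For contrast, a hypothetical play that passes a non-empty network_state (the exact schema the role expects is not shown in this log) would flip those conditionals to True and let the guard tasks run:

- hosts: managed-node1
  vars:
    network_state:        # any non-empty mapping makes "network_state != {}" evaluate to True
      interfaces: []      # placeholder content; the real structure is an assumption
  roles:
    - fedora.linux_system_roles.network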
41016 1727204194.44436: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41016 1727204194.44578: in run() - task 028d2410-947f-12d5-0ec4-00000000001e 41016 1727204194.44598: variable 'ansible_search_path' from source: unknown 41016 1727204194.44605: variable 'ansible_search_path' from source: unknown 41016 1727204194.44643: calling self._execute() 41016 1727204194.44744: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204194.44755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204194.44769: variable 'omit' from source: magic vars 41016 1727204194.45158: variable 'ansible_distribution_major_version' from source: facts 41016 1727204194.45195: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204194.45422: variable 'network_state' from source: role '' defaults 41016 1727204194.45426: Evaluated conditional (network_state != {}): False 41016 1727204194.45430: when evaluation is False, skipping this task 41016 1727204194.45432: _execute() done 41016 1727204194.45435: dumping result to json 41016 1727204194.45437: done dumping result, returning 41016 1727204194.45440: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-12d5-0ec4-00000000001e] 41016 1727204194.45443: sending task result for task 028d2410-947f-12d5-0ec4-00000000001e 41016 1727204194.46000: done sending task result for task 028d2410-947f-12d5-0ec4-00000000001e 41016 1727204194.46004: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41016 1727204194.46045: no more pending results, returning what we have 41016 1727204194.46049: results queue empty 41016 1727204194.46050: checking for any_errors_fatal 41016 1727204194.46057: done checking for any_errors_fatal 41016 1727204194.46058: checking for max_fail_percentage 41016 1727204194.46060: done checking for max_fail_percentage 41016 1727204194.46061: checking to see if all hosts have failed and the running result is not ok 41016 1727204194.46062: done checking to see if all hosts have failed 41016 1727204194.46062: getting the remaining hosts for this loop 41016 1727204194.46064: done getting the remaining hosts for this loop 41016 1727204194.46067: getting the next task for host managed-node1 41016 1727204194.46073: done getting next task for host managed-node1 41016 1727204194.46079: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41016 1727204194.46083: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204194.46097: getting variables 41016 1727204194.46099: in VariableManager get_vars() 41016 1727204194.46136: Calling all_inventory to load vars for managed-node1 41016 1727204194.46140: Calling groups_inventory to load vars for managed-node1 41016 1727204194.46142: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204194.46151: Calling all_plugins_play to load vars for managed-node1 41016 1727204194.46154: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204194.46157: Calling groups_plugins_play to load vars for managed-node1 41016 1727204194.47802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204194.49758: done with get_vars() 41016 1727204194.49897: done getting variables 41016 1727204194.49958: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:56:34 -0400 (0:00:00.062) 0:00:18.176 ***** 41016 1727204194.50028: entering _queue_task() for managed-node1/fail 41016 1727204194.50554: worker is 1 (out of 1 available) 41016 1727204194.50567: exiting _queue_task() for managed-node1/fail 41016 1727204194.50745: done queuing things up, now waiting for results queue to drain 41016 1727204194.50747: waiting for pending results... 
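The guard queued from main.yml:25 is evaluated in the trace that follows, and it shows the when-list being checked item by item: the major-version test (`ansible_distribution_major_version | int > 9`) and the distribution test (`ansible_distribution in __network_rh_distros`) both pass, and only the team-interface expression comes back False, so it alone ends up as the false_condition in the skip result. A minimal sketch assuming that three-part when-list, with an invented message (the real wording is not in this log):

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are no longer supported on this release  # assumed wording
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        - >-
          network_connections | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0
          or network_state.get("interfaces", []) | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0

Keeping the conditions as separate when-list items is what produces the separate "Evaluated conditional" lines in this log: Ansible works through the list in order and reports the first item that evaluates to False.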
41016 1727204194.51083: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41016 1727204194.51244: in run() - task 028d2410-947f-12d5-0ec4-00000000001f 41016 1727204194.51267: variable 'ansible_search_path' from source: unknown 41016 1727204194.51282: variable 'ansible_search_path' from source: unknown 41016 1727204194.51351: calling self._execute() 41016 1727204194.51457: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204194.51471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204194.51489: variable 'omit' from source: magic vars 41016 1727204194.51886: variable 'ansible_distribution_major_version' from source: facts 41016 1727204194.51905: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204194.52103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204194.54454: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204194.54527: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204194.54665: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204194.54668: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204194.54671: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204194.54742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204194.54790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204194.54824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.54927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204194.54966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204194.55067: variable 'ansible_distribution_major_version' from source: facts 41016 1727204194.55090: Evaluated conditional (ansible_distribution_major_version | int > 9): True 41016 1727204194.55260: variable 'ansible_distribution' from source: facts 41016 1727204194.55369: variable '__network_rh_distros' from source: role '' defaults 41016 1727204194.55372: Evaluated conditional (ansible_distribution in __network_rh_distros): True 41016 1727204194.55810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204194.55984: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204194.55988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.55991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204194.56234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204194.56237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204194.56240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204194.56242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.56321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204194.56470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204194.56539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204194.56588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204194.56622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.56684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204194.56711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204194.57024: variable 'network_connections' from source: task vars 41016 1727204194.57042: variable 'interface0' from source: play vars 41016 1727204194.57122: variable 'interface0' from source: play vars 41016 1727204194.57135: variable 'interface0' from source: play vars 41016 1727204194.57198: variable 'interface0' from source: play vars 41016 1727204194.57225: variable 'interface1' from source: play vars 41016 
1727204194.57291: variable 'interface1' from source: play vars 41016 1727204194.57304: variable 'interface1' from source: play vars 41016 1727204194.57373: variable 'interface1' from source: play vars 41016 1727204194.57395: variable 'network_state' from source: role '' defaults 41016 1727204194.57472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204194.57663: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204194.57752: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204194.57755: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204194.57786: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204194.57844: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204194.57881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204194.57913: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.57943: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204194.58081: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 41016 1727204194.58084: when evaluation is False, skipping this task 41016 1727204194.58086: _execute() done 41016 1727204194.58088: dumping result to json 41016 1727204194.58090: done dumping result, returning 41016 1727204194.58092: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-12d5-0ec4-00000000001f] 41016 1727204194.58094: sending task result for task 028d2410-947f-12d5-0ec4-00000000001f 41016 1727204194.58157: done sending task result for task 028d2410-947f-12d5-0ec4-00000000001f 41016 1727204194.58160: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 41016 1727204194.58230: no more pending results, returning what we have 41016 1727204194.58234: results queue empty 41016 1727204194.58235: checking for any_errors_fatal 41016 1727204194.58243: done checking for any_errors_fatal 41016 1727204194.58243: checking for max_fail_percentage 41016 1727204194.58245: done checking for max_fail_percentage 41016 1727204194.58246: checking to see if all hosts have 
failed and the running result is not ok 41016 1727204194.58247: done checking to see if all hosts have failed 41016 1727204194.58248: getting the remaining hosts for this loop 41016 1727204194.58249: done getting the remaining hosts for this loop 41016 1727204194.58253: getting the next task for host managed-node1 41016 1727204194.58261: done getting next task for host managed-node1 41016 1727204194.58265: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41016 1727204194.58268: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204194.58467: getting variables 41016 1727204194.58472: in VariableManager get_vars() 41016 1727204194.58515: Calling all_inventory to load vars for managed-node1 41016 1727204194.58519: Calling groups_inventory to load vars for managed-node1 41016 1727204194.58521: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204194.58531: Calling all_plugins_play to load vars for managed-node1 41016 1727204194.58534: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204194.58537: Calling groups_plugins_play to load vars for managed-node1 41016 1727204194.60046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204194.64691: done with get_vars() 41016 1727204194.64719: done getting variables 41016 1727204194.64808: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:56:34 -0400 (0:00:00.148) 0:00:18.324 ***** 41016 1727204194.64838: entering _queue_task() for managed-node1/dnf 41016 1727204194.65313: worker is 1 (out of 1 available) 41016 1727204194.65325: exiting _queue_task() for managed-node1/dnf 41016 1727204194.65336: done queuing things up, now waiting for results queue to drain 41016 1727204194.65337: waiting for pending results... 
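main.yml:36 loads the `dnf` action for a check that is only relevant when wireless or team connections are defined. In the trace below, `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7` evaluates True and `__network_wireless_connections_defined or __network_team_connections_defined` evaluates False, so the package query is skipped. A sketch of such a task, where only the module, the task name and the two conditions come from this log; the package list variable and the check-mode flag are assumptions:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"  # network_packages is a role default seen later in this log; its use here is an assumption
        state: latest
      check_mode: true  # assumed, since the task name suggests an availability check rather than an install
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined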
41016 1727204194.65502: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41016 1727204194.65599: in run() - task 028d2410-947f-12d5-0ec4-000000000020 41016 1727204194.65610: variable 'ansible_search_path' from source: unknown 41016 1727204194.65627: variable 'ansible_search_path' from source: unknown 41016 1727204194.65783: calling self._execute() 41016 1727204194.65786: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204194.65790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204194.65793: variable 'omit' from source: magic vars 41016 1727204194.66190: variable 'ansible_distribution_major_version' from source: facts 41016 1727204194.66207: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204194.66419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204194.68022: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204194.68073: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204194.68104: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204194.68132: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204194.68151: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204194.68215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204194.68235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204194.68254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.68280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204194.68292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204194.68379: variable 'ansible_distribution' from source: facts 41016 1727204194.68580: variable 'ansible_distribution_major_version' from source: facts 41016 1727204194.68583: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 41016 1727204194.68586: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204194.68649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204194.68682: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204194.68715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.68761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204194.68796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204194.68839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204194.68867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204194.68899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.68941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204194.68959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204194.69003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204194.69031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204194.69057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.69101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204194.69121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204194.69277: variable 'network_connections' from source: task vars 41016 1727204194.69295: variable 'interface0' from source: play vars 41016 1727204194.69360: variable 'interface0' from source: play vars 41016 1727204194.69378: variable 'interface0' from source: play vars 41016 1727204194.69446: variable 'interface0' from source: play vars 41016 1727204194.69463: variable 'interface1' from source: play vars 41016 
1727204194.69507: variable 'interface1' from source: play vars 41016 1727204194.69515: variable 'interface1' from source: play vars 41016 1727204194.69559: variable 'interface1' from source: play vars 41016 1727204194.69611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204194.69744: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204194.69771: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204194.69798: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204194.69821: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204194.69853: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204194.69867: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204194.69893: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.69908: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204194.69955: variable '__network_team_connections_defined' from source: role '' defaults 41016 1727204194.70110: variable 'network_connections' from source: task vars 41016 1727204194.70114: variable 'interface0' from source: play vars 41016 1727204194.70157: variable 'interface0' from source: play vars 41016 1727204194.70161: variable 'interface0' from source: play vars 41016 1727204194.70203: variable 'interface0' from source: play vars 41016 1727204194.70218: variable 'interface1' from source: play vars 41016 1727204194.70257: variable 'interface1' from source: play vars 41016 1727204194.70262: variable 'interface1' from source: play vars 41016 1727204194.70305: variable 'interface1' from source: play vars 41016 1727204194.70335: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41016 1727204194.70338: when evaluation is False, skipping this task 41016 1727204194.70341: _execute() done 41016 1727204194.70343: dumping result to json 41016 1727204194.70345: done dumping result, returning 41016 1727204194.70353: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-12d5-0ec4-000000000020] 41016 1727204194.70356: sending task result for task 028d2410-947f-12d5-0ec4-000000000020 41016 1727204194.70440: done sending task result for task 028d2410-947f-12d5-0ec4-000000000020 41016 1727204194.70442: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41016 1727204194.70492: no more pending results, returning what we have 41016 
1727204194.70496: results queue empty 41016 1727204194.70497: checking for any_errors_fatal 41016 1727204194.70503: done checking for any_errors_fatal 41016 1727204194.70504: checking for max_fail_percentage 41016 1727204194.70506: done checking for max_fail_percentage 41016 1727204194.70506: checking to see if all hosts have failed and the running result is not ok 41016 1727204194.70507: done checking to see if all hosts have failed 41016 1727204194.70508: getting the remaining hosts for this loop 41016 1727204194.70509: done getting the remaining hosts for this loop 41016 1727204194.70513: getting the next task for host managed-node1 41016 1727204194.70520: done getting next task for host managed-node1 41016 1727204194.70524: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41016 1727204194.70526: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204194.70539: getting variables 41016 1727204194.70541: in VariableManager get_vars() 41016 1727204194.70581: Calling all_inventory to load vars for managed-node1 41016 1727204194.70584: Calling groups_inventory to load vars for managed-node1 41016 1727204194.70587: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204194.70596: Calling all_plugins_play to load vars for managed-node1 41016 1727204194.70599: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204194.70601: Calling groups_plugins_play to load vars for managed-node1 41016 1727204194.71400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204194.72275: done with get_vars() 41016 1727204194.72291: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 41016 1727204194.72344: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:56:34 -0400 (0:00:00.075) 0:00:18.399 ***** 41016 1727204194.72365: entering _queue_task() for managed-node1/yum 41016 1727204194.72366: Creating lock for yum 41016 1727204194.72591: worker is 1 (out of 1 available) 41016 1727204194.72604: exiting _queue_task() for managed-node1/yum 41016 1727204194.72617: done queuing things up, now waiting for results queue to drain 41016 1727204194.72619: waiting for pending results... 
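The YUM counterpart at main.yml:48 is queued next; the `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` line above is expected, since the yum action is an alias for dnf on current hosts. In the trace that follows, the when-list never gets past `ansible_distribution_major_version | int < 8` (the major version logged earlier is greater than 9), so that is the only condition reported in the skip result. A sketch mirroring the DNF variant under the same assumptions:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"  # same assumption as in the DNF sketch above
        state: latest
      check_mode: true  # assumed
      when:
        - ansible_distribution_major_version | int < 8  # the only condition reached in this run (logged as false_condition)
        - __network_wireless_connections_defined or __network_team_connections_defined  # assumed to mirror the DNF variant; never reached here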
41016 1727204194.72792: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41016 1727204194.72874: in run() - task 028d2410-947f-12d5-0ec4-000000000021 41016 1727204194.72887: variable 'ansible_search_path' from source: unknown 41016 1727204194.72890: variable 'ansible_search_path' from source: unknown 41016 1727204194.72920: calling self._execute() 41016 1727204194.72989: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204194.72994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204194.73002: variable 'omit' from source: magic vars 41016 1727204194.73278: variable 'ansible_distribution_major_version' from source: facts 41016 1727204194.73291: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204194.73408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204194.74907: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204194.74961: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204194.74990: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204194.75018: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204194.75040: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204194.75097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204194.75120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204194.75139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.75167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204194.75179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204194.75250: variable 'ansible_distribution_major_version' from source: facts 41016 1727204194.75260: Evaluated conditional (ansible_distribution_major_version | int < 8): False 41016 1727204194.75263: when evaluation is False, skipping this task 41016 1727204194.75266: _execute() done 41016 1727204194.75269: dumping result to json 41016 1727204194.75271: done dumping result, returning 41016 1727204194.75279: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-12d5-0ec4-000000000021] 41016 
1727204194.75285: sending task result for task 028d2410-947f-12d5-0ec4-000000000021 41016 1727204194.75367: done sending task result for task 028d2410-947f-12d5-0ec4-000000000021 41016 1727204194.75370: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 41016 1727204194.75425: no more pending results, returning what we have 41016 1727204194.75429: results queue empty 41016 1727204194.75430: checking for any_errors_fatal 41016 1727204194.75438: done checking for any_errors_fatal 41016 1727204194.75439: checking for max_fail_percentage 41016 1727204194.75440: done checking for max_fail_percentage 41016 1727204194.75441: checking to see if all hosts have failed and the running result is not ok 41016 1727204194.75442: done checking to see if all hosts have failed 41016 1727204194.75443: getting the remaining hosts for this loop 41016 1727204194.75444: done getting the remaining hosts for this loop 41016 1727204194.75448: getting the next task for host managed-node1 41016 1727204194.75454: done getting next task for host managed-node1 41016 1727204194.75458: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41016 1727204194.75460: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204194.75474: getting variables 41016 1727204194.75478: in VariableManager get_vars() 41016 1727204194.75517: Calling all_inventory to load vars for managed-node1 41016 1727204194.75519: Calling groups_inventory to load vars for managed-node1 41016 1727204194.75521: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204194.75530: Calling all_plugins_play to load vars for managed-node1 41016 1727204194.75532: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204194.75534: Calling groups_plugins_play to load vars for managed-node1 41016 1727204194.76434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204194.77296: done with get_vars() 41016 1727204194.77311: done getting variables 41016 1727204194.77352: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:56:34 -0400 (0:00:00.050) 0:00:18.449 ***** 41016 1727204194.77374: entering _queue_task() for managed-node1/fail 41016 1727204194.77598: worker is 1 (out of 1 available) 41016 1727204194.77610: exiting _queue_task() for managed-node1/fail 41016 1727204194.77622: done queuing things up, now waiting for results queue to drain 41016 1727204194.77624: waiting for pending results... 
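main.yml:60 queues one more `fail` guard, this time to stop the run and ask for consent before NetworkManager would have to be restarted for wireless or team support. The trace below skips it because `__network_wireless_connections_defined or __network_team_connections_defined` is False for the connection profiles in this play. A minimal sketch with an assumed message; the real task may carry additional conditions that this run never reaches:

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: NetworkManager must be restarted to support wireless or team interfaces, please confirm before rerunning  # assumed wording
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined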
41016 1727204194.77800: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41016 1727204194.77888: in run() - task 028d2410-947f-12d5-0ec4-000000000022 41016 1727204194.77898: variable 'ansible_search_path' from source: unknown 41016 1727204194.77902: variable 'ansible_search_path' from source: unknown 41016 1727204194.77935: calling self._execute() 41016 1727204194.78005: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204194.78010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204194.78021: variable 'omit' from source: magic vars 41016 1727204194.78305: variable 'ansible_distribution_major_version' from source: facts 41016 1727204194.78317: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204194.78401: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204194.78544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204194.80038: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204194.80090: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204194.80119: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204194.80147: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204194.80168: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204194.80231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204194.80254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204194.80271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.80299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204194.80311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204194.80351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204194.80363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204194.80381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.80405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204194.80418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204194.80445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204194.80464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204194.80481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.80505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204194.80518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204194.80634: variable 'network_connections' from source: task vars 41016 1727204194.80644: variable 'interface0' from source: play vars 41016 1727204194.80698: variable 'interface0' from source: play vars 41016 1727204194.80706: variable 'interface0' from source: play vars 41016 1727204194.80751: variable 'interface0' from source: play vars 41016 1727204194.80761: variable 'interface1' from source: play vars 41016 1727204194.80807: variable 'interface1' from source: play vars 41016 1727204194.80815: variable 'interface1' from source: play vars 41016 1727204194.80856: variable 'interface1' from source: play vars 41016 1727204194.80911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204194.81034: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204194.81061: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204194.81084: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204194.81113: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204194.81140: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204194.81156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204194.81173: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.81192: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204194.81242: variable '__network_team_connections_defined' from source: role '' defaults 41016 1727204194.81392: variable 'network_connections' from source: task vars 41016 1727204194.81395: variable 'interface0' from source: play vars 41016 1727204194.81441: variable 'interface0' from source: play vars 41016 1727204194.81446: variable 'interface0' from source: play vars 41016 1727204194.81488: variable 'interface0' from source: play vars 41016 1727204194.81497: variable 'interface1' from source: play vars 41016 1727204194.81546: variable 'interface1' from source: play vars 41016 1727204194.81557: variable 'interface1' from source: play vars 41016 1727204194.81593: variable 'interface1' from source: play vars 41016 1727204194.81620: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41016 1727204194.81624: when evaluation is False, skipping this task 41016 1727204194.81627: _execute() done 41016 1727204194.81629: dumping result to json 41016 1727204194.81631: done dumping result, returning 41016 1727204194.81638: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-12d5-0ec4-000000000022] 41016 1727204194.81642: sending task result for task 028d2410-947f-12d5-0ec4-000000000022 41016 1727204194.81729: done sending task result for task 028d2410-947f-12d5-0ec4-000000000022 41016 1727204194.81732: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41016 1727204194.81802: no more pending results, returning what we have 41016 1727204194.81806: results queue empty 41016 1727204194.81807: checking for any_errors_fatal 41016 1727204194.81813: done checking for any_errors_fatal 41016 1727204194.81813: checking for max_fail_percentage 41016 1727204194.81815: done checking for max_fail_percentage 41016 1727204194.81816: checking to see if all hosts have failed and the running result is not ok 41016 1727204194.81817: done checking to see if all hosts have failed 41016 1727204194.81817: getting the remaining hosts for this loop 41016 1727204194.81819: done getting the remaining hosts for this loop 41016 1727204194.81823: getting the next task for host managed-node1 41016 1727204194.81829: done getting next task for host managed-node1 41016 1727204194.81833: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 41016 1727204194.81836: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204194.81852: getting variables 41016 1727204194.81853: in VariableManager get_vars() 41016 1727204194.81893: Calling all_inventory to load vars for managed-node1 41016 1727204194.81895: Calling groups_inventory to load vars for managed-node1 41016 1727204194.81897: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204194.81907: Calling all_plugins_play to load vars for managed-node1 41016 1727204194.81909: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204194.81912: Calling groups_plugins_play to load vars for managed-node1 41016 1727204194.82694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204194.83550: done with get_vars() 41016 1727204194.83565: done getting variables 41016 1727204194.83608: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:56:34 -0400 (0:00:00.062) 0:00:18.512 ***** 41016 1727204194.83631: entering _queue_task() for managed-node1/package 41016 1727204194.83846: worker is 1 (out of 1 available) 41016 1727204194.83859: exiting _queue_task() for managed-node1/package 41016 1727204194.83871: done queuing things up, now waiting for results queue to drain 41016 1727204194.83872: waiting for pending results... 41016 1727204194.84040: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 41016 1727204194.84128: in run() - task 028d2410-947f-12d5-0ec4-000000000023 41016 1727204194.84138: variable 'ansible_search_path' from source: unknown 41016 1727204194.84142: variable 'ansible_search_path' from source: unknown 41016 1727204194.84169: calling self._execute() 41016 1727204194.84245: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204194.84249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204194.84257: variable 'omit' from source: magic vars 41016 1727204194.84542: variable 'ansible_distribution_major_version' from source: facts 41016 1727204194.84545: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204194.84677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204194.84862: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204194.84898: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204194.84922: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204194.84974: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204194.85051: variable 'network_packages' from source: role '' defaults 41016 1727204194.85122: variable '__network_provider_setup' from source: role '' defaults 41016 1727204194.85130: variable '__network_service_name_default_nm' from source: role '' defaults 41016 1727204194.85179: variable 
'__network_service_name_default_nm' from source: role '' defaults 41016 1727204194.85186: variable '__network_packages_default_nm' from source: role '' defaults 41016 1727204194.85233: variable '__network_packages_default_nm' from source: role '' defaults 41016 1727204194.85345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204194.86870: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204194.86913: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204194.86939: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204194.86963: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204194.86984: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204194.87041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204194.87064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204194.87084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.87113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204194.87122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204194.87153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204194.87172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204194.87191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.87216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204194.87227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204194.87359: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41016 1727204194.87432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204194.87459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204194.87478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.87507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204194.87519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204194.87579: variable 'ansible_python' from source: facts 41016 1727204194.87603: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41016 1727204194.87653: variable '__network_wpa_supplicant_required' from source: role '' defaults 41016 1727204194.87715: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41016 1727204194.87788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204194.87804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204194.87825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.87849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204194.87859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204194.87892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204194.87914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204194.87933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.87957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204194.87967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204194.88062: variable 'network_connections' from source: task vars 41016 1727204194.88066: variable 'interface0' from source: play vars 41016 1727204194.88136: variable 'interface0' from source: play vars 41016 1727204194.88144: variable 'interface0' from source: play vars 41016 1727204194.88217: variable 'interface0' from source: play vars 41016 1727204194.88228: variable 'interface1' from source: play vars 41016 1727204194.88298: variable 'interface1' from source: play vars 41016 1727204194.88306: variable 'interface1' from source: play vars 41016 1727204194.88377: variable 'interface1' from source: play vars 41016 1727204194.88429: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204194.88448: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204194.88473: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204194.88494: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204194.88529: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204194.88701: variable 'network_connections' from source: task vars 41016 1727204194.88705: variable 'interface0' from source: play vars 41016 1727204194.88772: variable 'interface0' from source: play vars 41016 1727204194.88781: variable 'interface0' from source: play vars 41016 1727204194.88850: variable 'interface0' from source: play vars 41016 1727204194.88860: variable 'interface1' from source: play vars 41016 1727204194.88931: variable 'interface1' from source: play vars 41016 1727204194.88938: variable 'interface1' from source: play vars 41016 1727204194.89005: variable 'interface1' from source: play vars 41016 1727204194.89047: variable '__network_packages_default_wireless' from source: role '' defaults 41016 1727204194.89101: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204194.89295: variable 'network_connections' from source: task vars 41016 1727204194.89298: variable 'interface0' from source: play vars 41016 1727204194.89349: variable 'interface0' from source: play vars 41016 1727204194.89353: variable 'interface0' from source: play vars 41016 1727204194.89395: variable 'interface0' from source: play vars 41016 1727204194.89404: variable 'interface1' from source: play vars 41016 1727204194.89450: variable 'interface1' from source: play vars 41016 1727204194.89453: variable 'interface1' from source: play vars 41016 1727204194.89500: variable 'interface1' from source: play vars 41016 1727204194.89522: variable '__network_packages_default_team' from source: role '' defaults 41016 1727204194.89574: variable '__network_team_connections_defined' from source: role '' defaults 41016 1727204194.89761: variable 'network_connections' from source: task vars 41016 1727204194.89765: variable 'interface0' from source: play vars 41016 1727204194.89815: variable 
'interface0' from source: play vars 41016 1727204194.89819: variable 'interface0' from source: play vars 41016 1727204194.89862: variable 'interface0' from source: play vars 41016 1727204194.89871: variable 'interface1' from source: play vars 41016 1727204194.89921: variable 'interface1' from source: play vars 41016 1727204194.89927: variable 'interface1' from source: play vars 41016 1727204194.89970: variable 'interface1' from source: play vars 41016 1727204194.90018: variable '__network_service_name_default_initscripts' from source: role '' defaults 41016 1727204194.90059: variable '__network_service_name_default_initscripts' from source: role '' defaults 41016 1727204194.90065: variable '__network_packages_default_initscripts' from source: role '' defaults 41016 1727204194.90112: variable '__network_packages_default_initscripts' from source: role '' defaults 41016 1727204194.90243: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41016 1727204194.90542: variable 'network_connections' from source: task vars 41016 1727204194.90545: variable 'interface0' from source: play vars 41016 1727204194.90591: variable 'interface0' from source: play vars 41016 1727204194.90597: variable 'interface0' from source: play vars 41016 1727204194.90638: variable 'interface0' from source: play vars 41016 1727204194.90647: variable 'interface1' from source: play vars 41016 1727204194.90691: variable 'interface1' from source: play vars 41016 1727204194.90697: variable 'interface1' from source: play vars 41016 1727204194.90738: variable 'interface1' from source: play vars 41016 1727204194.90747: variable 'ansible_distribution' from source: facts 41016 1727204194.90750: variable '__network_rh_distros' from source: role '' defaults 41016 1727204194.90755: variable 'ansible_distribution_major_version' from source: facts 41016 1727204194.90778: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41016 1727204194.90877: variable 'ansible_distribution' from source: facts 41016 1727204194.90881: variable '__network_rh_distros' from source: role '' defaults 41016 1727204194.90883: variable 'ansible_distribution_major_version' from source: facts 41016 1727204194.90897: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41016 1727204194.91003: variable 'ansible_distribution' from source: facts 41016 1727204194.91006: variable '__network_rh_distros' from source: role '' defaults 41016 1727204194.91011: variable 'ansible_distribution_major_version' from source: facts 41016 1727204194.91032: variable 'network_provider' from source: set_fact 41016 1727204194.91042: variable 'ansible_facts' from source: unknown 41016 1727204194.91399: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 41016 1727204194.91403: when evaluation is False, skipping this task 41016 1727204194.91405: _execute() done 41016 1727204194.91408: dumping result to json 41016 1727204194.91412: done dumping result, returning 41016 1727204194.91417: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-12d5-0ec4-000000000023] 41016 1727204194.91422: sending task result for task 028d2410-947f-12d5-0ec4-000000000023 41016 1727204194.91508: done sending task result for task 028d2410-947f-12d5-0ec4-000000000023 41016 1727204194.91513: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not 
network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 41016 1727204194.91583: no more pending results, returning what we have 41016 1727204194.91588: results queue empty 41016 1727204194.91593: checking for any_errors_fatal 41016 1727204194.91601: done checking for any_errors_fatal 41016 1727204194.91601: checking for max_fail_percentage 41016 1727204194.91603: done checking for max_fail_percentage 41016 1727204194.91604: checking to see if all hosts have failed and the running result is not ok 41016 1727204194.91604: done checking to see if all hosts have failed 41016 1727204194.91605: getting the remaining hosts for this loop 41016 1727204194.91606: done getting the remaining hosts for this loop 41016 1727204194.91613: getting the next task for host managed-node1 41016 1727204194.91620: done getting next task for host managed-node1 41016 1727204194.91624: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41016 1727204194.91626: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204194.91639: getting variables 41016 1727204194.91641: in VariableManager get_vars() 41016 1727204194.91680: Calling all_inventory to load vars for managed-node1 41016 1727204194.91683: Calling groups_inventory to load vars for managed-node1 41016 1727204194.91685: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204194.91694: Calling all_plugins_play to load vars for managed-node1 41016 1727204194.91696: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204194.91698: Calling groups_plugins_play to load vars for managed-node1 41016 1727204194.92611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204194.93482: done with get_vars() 41016 1727204194.93496: done getting variables 41016 1727204194.93541: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:56:34 -0400 (0:00:00.099) 0:00:18.611 ***** 41016 1727204194.93563: entering _queue_task() for managed-node1/package 41016 1727204194.93792: worker is 1 (out of 1 available) 41016 1727204194.93805: exiting _queue_task() for managed-node1/package 41016 1727204194.93818: done queuing things up, now waiting for results queue to drain 41016 1727204194.93820: waiting for pending results... 
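The skip above is the role's package gating at work: the "Install packages" task only runs when something in network_packages is missing from the gathered package facts. A minimal sketch of a task shaped like the name, action plugin, and false_condition recorded in the log (an illustration, not the role's verbatim source):

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())

Here every package resolved into network_packages is already present in ansible_facts.packages, so the conditional is False and the host reports "skipping".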
41016 1727204194.93987: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41016 1727204194.94066: in run() - task 028d2410-947f-12d5-0ec4-000000000024 41016 1727204194.94078: variable 'ansible_search_path' from source: unknown 41016 1727204194.94081: variable 'ansible_search_path' from source: unknown 41016 1727204194.94111: calling self._execute() 41016 1727204194.94185: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204194.94189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204194.94198: variable 'omit' from source: magic vars 41016 1727204194.94469: variable 'ansible_distribution_major_version' from source: facts 41016 1727204194.94481: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204194.94560: variable 'network_state' from source: role '' defaults 41016 1727204194.94569: Evaluated conditional (network_state != {}): False 41016 1727204194.94572: when evaluation is False, skipping this task 41016 1727204194.94574: _execute() done 41016 1727204194.94578: dumping result to json 41016 1727204194.94581: done dumping result, returning 41016 1727204194.94589: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-12d5-0ec4-000000000024] 41016 1727204194.94594: sending task result for task 028d2410-947f-12d5-0ec4-000000000024 41016 1727204194.94682: done sending task result for task 028d2410-947f-12d5-0ec4-000000000024 41016 1727204194.94685: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41016 1727204194.94739: no more pending results, returning what we have 41016 1727204194.94743: results queue empty 41016 1727204194.94744: checking for any_errors_fatal 41016 1727204194.94749: done checking for any_errors_fatal 41016 1727204194.94750: checking for max_fail_percentage 41016 1727204194.94751: done checking for max_fail_percentage 41016 1727204194.94752: checking to see if all hosts have failed and the running result is not ok 41016 1727204194.94753: done checking to see if all hosts have failed 41016 1727204194.94754: getting the remaining hosts for this loop 41016 1727204194.94755: done getting the remaining hosts for this loop 41016 1727204194.94758: getting the next task for host managed-node1 41016 1727204194.94765: done getting next task for host managed-node1 41016 1727204194.94768: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41016 1727204194.94771: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204194.94787: getting variables 41016 1727204194.94788: in VariableManager get_vars() 41016 1727204194.94823: Calling all_inventory to load vars for managed-node1 41016 1727204194.94825: Calling groups_inventory to load vars for managed-node1 41016 1727204194.94827: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204194.94836: Calling all_plugins_play to load vars for managed-node1 41016 1727204194.94838: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204194.94840: Calling groups_plugins_play to load vars for managed-node1 41016 1727204194.95590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204194.96454: done with get_vars() 41016 1727204194.96468: done getting variables 41016 1727204194.96514: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:56:34 -0400 (0:00:00.029) 0:00:18.641 ***** 41016 1727204194.96536: entering _queue_task() for managed-node1/package 41016 1727204194.96746: worker is 1 (out of 1 available) 41016 1727204194.96758: exiting _queue_task() for managed-node1/package 41016 1727204194.96769: done queuing things up, now waiting for results queue to drain 41016 1727204194.96770: waiting for pending results... 
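The previous task skipped because network_state is still the role default of {}; the "Install python3-libnmstate" task queued next is gated on the same expression and will skip for the same reason. A hedged sketch of that gating (package names inferred from the task title, not taken from the role source):

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager   # inferred from the task name, for illustration only
      - nmstate
    state: present
  when: network_state != {}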
41016 1727204194.96938: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41016 1727204194.97017: in run() - task 028d2410-947f-12d5-0ec4-000000000025 41016 1727204194.97030: variable 'ansible_search_path' from source: unknown 41016 1727204194.97034: variable 'ansible_search_path' from source: unknown 41016 1727204194.97062: calling self._execute() 41016 1727204194.97135: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204194.97139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204194.97148: variable 'omit' from source: magic vars 41016 1727204194.97416: variable 'ansible_distribution_major_version' from source: facts 41016 1727204194.97426: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204194.97507: variable 'network_state' from source: role '' defaults 41016 1727204194.97515: Evaluated conditional (network_state != {}): False 41016 1727204194.97518: when evaluation is False, skipping this task 41016 1727204194.97521: _execute() done 41016 1727204194.97523: dumping result to json 41016 1727204194.97527: done dumping result, returning 41016 1727204194.97534: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-12d5-0ec4-000000000025] 41016 1727204194.97545: sending task result for task 028d2410-947f-12d5-0ec4-000000000025 41016 1727204194.97627: done sending task result for task 028d2410-947f-12d5-0ec4-000000000025 41016 1727204194.97630: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41016 1727204194.97684: no more pending results, returning what we have 41016 1727204194.97689: results queue empty 41016 1727204194.97690: checking for any_errors_fatal 41016 1727204194.97698: done checking for any_errors_fatal 41016 1727204194.97699: checking for max_fail_percentage 41016 1727204194.97700: done checking for max_fail_percentage 41016 1727204194.97701: checking to see if all hosts have failed and the running result is not ok 41016 1727204194.97702: done checking to see if all hosts have failed 41016 1727204194.97702: getting the remaining hosts for this loop 41016 1727204194.97704: done getting the remaining hosts for this loop 41016 1727204194.97707: getting the next task for host managed-node1 41016 1727204194.97714: done getting next task for host managed-node1 41016 1727204194.97718: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41016 1727204194.97720: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204194.97733: getting variables 41016 1727204194.97734: in VariableManager get_vars() 41016 1727204194.97766: Calling all_inventory to load vars for managed-node1 41016 1727204194.97768: Calling groups_inventory to load vars for managed-node1 41016 1727204194.97770: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204194.97782: Calling all_plugins_play to load vars for managed-node1 41016 1727204194.97789: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204194.97792: Calling groups_plugins_play to load vars for managed-node1 41016 1727204194.98643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204194.99494: done with get_vars() 41016 1727204194.99510: done getting variables 41016 1727204194.99580: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:56:34 -0400 (0:00:00.030) 0:00:18.672 ***** 41016 1727204194.99615: entering _queue_task() for managed-node1/service 41016 1727204194.99617: Creating lock for service 41016 1727204194.99881: worker is 1 (out of 1 available) 41016 1727204194.99891: exiting _queue_task() for managed-node1/service 41016 1727204194.99903: done queuing things up, now waiting for results queue to drain 41016 1727204194.99905: waiting for pending results... 
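This is the first service task of the run, which is why a lock for the service action is created and the plugin is loaded fresh (found_in_cache=False). A sketch of a conditional restart matching the task name and the condition evaluated below (an illustration, not the role's verbatim source; the service name is assumed):

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager   # assumed service name for illustration
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined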
41016 1727204195.00292: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41016 1727204195.00320: in run() - task 028d2410-947f-12d5-0ec4-000000000026 41016 1727204195.00338: variable 'ansible_search_path' from source: unknown 41016 1727204195.00346: variable 'ansible_search_path' from source: unknown 41016 1727204195.00388: calling self._execute() 41016 1727204195.00484: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204195.00496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204195.00511: variable 'omit' from source: magic vars 41016 1727204195.00899: variable 'ansible_distribution_major_version' from source: facts 41016 1727204195.00915: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204195.01009: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204195.01185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204195.02664: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204195.02779: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204195.02783: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204195.02809: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204195.02980: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204195.02984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204195.02987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204195.02989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204195.03015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204195.03018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204195.03063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204195.03086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204195.03109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 41016 1727204195.03150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204195.03163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204195.03245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204195.03249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204195.03251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204195.03289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204195.03301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204195.03752: variable 'network_connections' from source: task vars 41016 1727204195.03755: variable 'interface0' from source: play vars 41016 1727204195.03757: variable 'interface0' from source: play vars 41016 1727204195.03759: variable 'interface0' from source: play vars 41016 1727204195.03761: variable 'interface0' from source: play vars 41016 1727204195.03763: variable 'interface1' from source: play vars 41016 1727204195.03765: variable 'interface1' from source: play vars 41016 1727204195.03767: variable 'interface1' from source: play vars 41016 1727204195.03769: variable 'interface1' from source: play vars 41016 1727204195.03771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204195.03947: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204195.03983: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204195.04015: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204195.04039: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204195.04181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204195.04185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204195.04187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204195.04189: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204195.04199: variable '__network_team_connections_defined' from source: role '' defaults 41016 1727204195.04419: variable 'network_connections' from source: task vars 41016 1727204195.04422: variable 'interface0' from source: play vars 41016 1727204195.04483: variable 'interface0' from source: play vars 41016 1727204195.04489: variable 'interface0' from source: play vars 41016 1727204195.04546: variable 'interface0' from source: play vars 41016 1727204195.04558: variable 'interface1' from source: play vars 41016 1727204195.04616: variable 'interface1' from source: play vars 41016 1727204195.04628: variable 'interface1' from source: play vars 41016 1727204195.04679: variable 'interface1' from source: play vars 41016 1727204195.04715: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41016 1727204195.04718: when evaluation is False, skipping this task 41016 1727204195.04721: _execute() done 41016 1727204195.04724: dumping result to json 41016 1727204195.04726: done dumping result, returning 41016 1727204195.04735: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-12d5-0ec4-000000000026] 41016 1727204195.04739: sending task result for task 028d2410-947f-12d5-0ec4-000000000026 41016 1727204195.04827: done sending task result for task 028d2410-947f-12d5-0ec4-000000000026 41016 1727204195.04830: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41016 1727204195.04912: no more pending results, returning what we have 41016 1727204195.04916: results queue empty 41016 1727204195.04917: checking for any_errors_fatal 41016 1727204195.04924: done checking for any_errors_fatal 41016 1727204195.04924: checking for max_fail_percentage 41016 1727204195.04926: done checking for max_fail_percentage 41016 1727204195.04927: checking to see if all hosts have failed and the running result is not ok 41016 1727204195.04928: done checking to see if all hosts have failed 41016 1727204195.04928: getting the remaining hosts for this loop 41016 1727204195.04930: done getting the remaining hosts for this loop 41016 1727204195.04933: getting the next task for host managed-node1 41016 1727204195.04940: done getting next task for host managed-node1 41016 1727204195.04944: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41016 1727204195.04946: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204195.04959: getting variables 41016 1727204195.04961: in VariableManager get_vars() 41016 1727204195.05004: Calling all_inventory to load vars for managed-node1 41016 1727204195.05006: Calling groups_inventory to load vars for managed-node1 41016 1727204195.05008: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204195.05017: Calling all_plugins_play to load vars for managed-node1 41016 1727204195.05019: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204195.05021: Calling groups_plugins_play to load vars for managed-node1 41016 1727204195.06648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204195.08489: done with get_vars() 41016 1727204195.08514: done getting variables 41016 1727204195.08572: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:56:35 -0400 (0:00:00.089) 0:00:18.762 ***** 41016 1727204195.08603: entering _queue_task() for managed-node1/service 41016 1727204195.08931: worker is 1 (out of 1 available) 41016 1727204195.08944: exiting _queue_task() for managed-node1/service 41016 1727204195.08957: done queuing things up, now waiting for results queue to drain 41016 1727204195.08958: waiting for pending results... 41016 1727204195.09594: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41016 1727204195.09645: in run() - task 028d2410-947f-12d5-0ec4-000000000027 41016 1727204195.09702: variable 'ansible_search_path' from source: unknown 41016 1727204195.09882: variable 'ansible_search_path' from source: unknown 41016 1727204195.09886: calling self._execute() 41016 1727204195.09970: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204195.10036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204195.10098: variable 'omit' from source: magic vars 41016 1727204195.10651: variable 'ansible_distribution_major_version' from source: facts 41016 1727204195.10668: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204195.10838: variable 'network_provider' from source: set_fact 41016 1727204195.10848: variable 'network_state' from source: role '' defaults 41016 1727204195.10863: Evaluated conditional (network_provider == "nm" or network_state != {}): True 41016 1727204195.10875: variable 'omit' from source: magic vars 41016 1727204195.10938: variable 'omit' from source: magic vars 41016 1727204195.10969: variable 'network_service_name' from source: role '' defaults 41016 1727204195.11049: variable 'network_service_name' from source: role '' defaults 41016 1727204195.11161: variable '__network_provider_setup' from source: role '' defaults 41016 1727204195.11171: variable '__network_service_name_default_nm' from source: role '' defaults 41016 1727204195.11242: variable '__network_service_name_default_nm' from source: role '' defaults 41016 1727204195.11256: variable '__network_packages_default_nm' from source: role '' defaults 
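Unlike the preceding tasks, "Enable and start NetworkManager" passes its conditional (network_provider == "nm" or network_state != {}), so the executor keeps resolving role defaults and will eventually contact the host. A minimal sketch consistent with the task name, the service action, and the variables pulled in here (an illustration, not the role's verbatim source):

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started     # "enable and start" per the task name; exact arguments are assumed
    enabled: true
  when: network_provider == "nm" or network_state != {}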
41016 1727204195.11327: variable '__network_packages_default_nm' from source: role '' defaults 41016 1727204195.11653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204195.13742: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204195.13831: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204195.13870: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204195.13913: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204195.13948: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204195.14029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204195.14065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204195.14093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204195.14137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204195.14160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204195.14208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204195.14238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204195.14266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204195.14311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204195.14330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204195.14557: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41016 1727204195.14680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204195.14779: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204195.14782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204195.14785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204195.14786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204195.14874: variable 'ansible_python' from source: facts 41016 1727204195.14907: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41016 1727204195.14993: variable '__network_wpa_supplicant_required' from source: role '' defaults 41016 1727204195.15070: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41016 1727204195.15202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204195.15238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204195.15268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204195.15315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204195.15337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204195.15391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204195.15453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204195.15460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204195.15506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204195.15529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204195.15779: variable 'network_connections' from 
source: task vars 41016 1727204195.15782: variable 'interface0' from source: play vars 41016 1727204195.15784: variable 'interface0' from source: play vars 41016 1727204195.15786: variable 'interface0' from source: play vars 41016 1727204195.15851: variable 'interface0' from source: play vars 41016 1727204195.15887: variable 'interface1' from source: play vars 41016 1727204195.15966: variable 'interface1' from source: play vars 41016 1727204195.15983: variable 'interface1' from source: play vars 41016 1727204195.16060: variable 'interface1' from source: play vars 41016 1727204195.16188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204195.16400: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204195.16460: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204195.16511: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204195.16560: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204195.16624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204195.16662: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204195.16708: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204195.16770: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204195.16799: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204195.17075: variable 'network_connections' from source: task vars 41016 1727204195.17092: variable 'interface0' from source: play vars 41016 1727204195.17184: variable 'interface0' from source: play vars 41016 1727204195.17187: variable 'interface0' from source: play vars 41016 1727204195.17262: variable 'interface0' from source: play vars 41016 1727204195.17583: variable 'interface1' from source: play vars 41016 1727204195.17586: variable 'interface1' from source: play vars 41016 1727204195.17601: variable 'interface1' from source: play vars 41016 1727204195.17674: variable 'interface1' from source: play vars 41016 1727204195.17883: variable '__network_packages_default_wireless' from source: role '' defaults 41016 1727204195.18048: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204195.18717: variable 'network_connections' from source: task vars 41016 1727204195.18788: variable 'interface0' from source: play vars 41016 1727204195.18860: variable 'interface0' from source: play vars 41016 1727204195.19081: variable 'interface0' from source: play vars 41016 1727204195.19084: variable 'interface0' from source: play vars 41016 1727204195.19092: variable 'interface1' from source: play vars 41016 1727204195.19169: variable 'interface1' from source: play vars 41016 1727204195.19290: variable 'interface1' from source: 
play vars 41016 1727204195.19369: variable 'interface1' from source: play vars 41016 1727204195.19464: variable '__network_packages_default_team' from source: role '' defaults 41016 1727204195.19616: variable '__network_team_connections_defined' from source: role '' defaults 41016 1727204195.20279: variable 'network_connections' from source: task vars 41016 1727204195.20307: variable 'interface0' from source: play vars 41016 1727204195.20408: variable 'interface0' from source: play vars 41016 1727204195.20490: variable 'interface0' from source: play vars 41016 1727204195.20638: variable 'interface0' from source: play vars 41016 1727204195.20781: variable 'interface1' from source: play vars 41016 1727204195.20824: variable 'interface1' from source: play vars 41016 1727204195.20835: variable 'interface1' from source: play vars 41016 1727204195.20949: variable 'interface1' from source: play vars 41016 1727204195.21143: variable '__network_service_name_default_initscripts' from source: role '' defaults 41016 1727204195.21383: variable '__network_service_name_default_initscripts' from source: role '' defaults 41016 1727204195.21386: variable '__network_packages_default_initscripts' from source: role '' defaults 41016 1727204195.21415: variable '__network_packages_default_initscripts' from source: role '' defaults 41016 1727204195.21986: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41016 1727204195.23160: variable 'network_connections' from source: task vars 41016 1727204195.23163: variable 'interface0' from source: play vars 41016 1727204195.23165: variable 'interface0' from source: play vars 41016 1727204195.23167: variable 'interface0' from source: play vars 41016 1727204195.23326: variable 'interface0' from source: play vars 41016 1727204195.23346: variable 'interface1' from source: play vars 41016 1727204195.23524: variable 'interface1' from source: play vars 41016 1727204195.23536: variable 'interface1' from source: play vars 41016 1727204195.23780: variable 'interface1' from source: play vars 41016 1727204195.23783: variable 'ansible_distribution' from source: facts 41016 1727204195.23785: variable '__network_rh_distros' from source: role '' defaults 41016 1727204195.23787: variable 'ansible_distribution_major_version' from source: facts 41016 1727204195.23789: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41016 1727204195.24143: variable 'ansible_distribution' from source: facts 41016 1727204195.24153: variable '__network_rh_distros' from source: role '' defaults 41016 1727204195.24163: variable 'ansible_distribution_major_version' from source: facts 41016 1727204195.24184: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41016 1727204195.24527: variable 'ansible_distribution' from source: facts 41016 1727204195.24572: variable '__network_rh_distros' from source: role '' defaults 41016 1727204195.24586: variable 'ansible_distribution_major_version' from source: facts 41016 1727204195.24714: variable 'network_provider' from source: set_fact 41016 1727204195.24739: variable 'omit' from source: magic vars 41016 1727204195.24767: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204195.24813: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204195.24914: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
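The connection-variable records that follow show the effective SSH transport settings for managed-node1; most come from defaults rather than explicit inventory entries (their source is logged as "unknown"). A hypothetical host_vars fragment that would yield the same values, using standard Ansible connection variables:

ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false
ansible_module_compression: ZIP_DEFLATED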
41016 1727204195.24938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204195.24960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204195.25027: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204195.25107: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204195.25113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204195.25325: Set connection var ansible_shell_executable to /bin/sh 41016 1727204195.25328: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204195.25331: Set connection var ansible_shell_type to sh 41016 1727204195.25337: Set connection var ansible_timeout to 10 41016 1727204195.25347: Set connection var ansible_pipelining to False 41016 1727204195.25357: Set connection var ansible_connection to ssh 41016 1727204195.25386: variable 'ansible_shell_executable' from source: unknown 41016 1727204195.25439: variable 'ansible_connection' from source: unknown 41016 1727204195.25447: variable 'ansible_module_compression' from source: unknown 41016 1727204195.25453: variable 'ansible_shell_type' from source: unknown 41016 1727204195.25486: variable 'ansible_shell_executable' from source: unknown 41016 1727204195.25494: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204195.25502: variable 'ansible_pipelining' from source: unknown 41016 1727204195.25514: variable 'ansible_timeout' from source: unknown 41016 1727204195.25524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204195.25905: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204195.25991: variable 'omit' from source: magic vars 41016 1727204195.25994: starting attempt loop 41016 1727204195.25996: running the handler 41016 1727204195.26025: variable 'ansible_facts' from source: unknown 41016 1727204195.27692: _low_level_execute_command(): starting 41016 1727204195.27704: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204195.28973: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204195.28994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204195.29005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204195.29138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204195.29159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204195.29267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204195.31062: stdout chunk (state=3): >>>/root <<< 41016 1727204195.31190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204195.31205: stderr chunk (state=3): >>><<< 41016 1727204195.31214: stdout chunk (state=3): >>><<< 41016 1727204195.31239: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204195.31251: _low_level_execute_command(): starting 41016 1727204195.31258: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204195.3123918-42626-147377005602979 `" && echo ansible-tmp-1727204195.3123918-42626-147377005602979="` echo /root/.ansible/tmp/ansible-tmp-1727204195.3123918-42626-147377005602979 `" ) && sleep 0' 41016 1727204195.31879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204195.31895: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204195.31933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204195.31948: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204195.31958: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204195.31993: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204195.32065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204195.32085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204195.32115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204195.32219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204195.34321: stdout chunk (state=3): >>>ansible-tmp-1727204195.3123918-42626-147377005602979=/root/.ansible/tmp/ansible-tmp-1727204195.3123918-42626-147377005602979 <<< 41016 1727204195.34493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204195.34496: stdout chunk (state=3): >>><<< 41016 1727204195.34499: stderr chunk (state=3): >>><<< 41016 1727204195.34681: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204195.3123918-42626-147377005602979=/root/.ansible/tmp/ansible-tmp-1727204195.3123918-42626-147377005602979 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204195.34690: variable 'ansible_module_compression' from source: unknown 41016 1727204195.34694: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 41016 1727204195.34697: ANSIBALLZ: Acquiring lock 41016 1727204195.34699: ANSIBALLZ: Lock acquired: 140580610774160 41016 1727204195.34701: ANSIBALLZ: Creating module 41016 1727204195.68299: ANSIBALLZ: Writing module into payload 41016 1727204195.68466: ANSIBALLZ: Writing module 41016 1727204195.68500: ANSIBALLZ: Renaming module 41016 1727204195.68507: ANSIBALLZ: Done creating module 41016 1727204195.68540: variable 'ansible_facts' from source: unknown 41016 1727204195.68759: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204195.3123918-42626-147377005602979/AnsiballZ_systemd.py 41016 1727204195.69001: Sending initial data 41016 1727204195.69004: Sent initial data (156 bytes) 41016 1727204195.69596: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204195.69611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204195.69715: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204195.69740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204195.69859: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204195.71600: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 41016 1727204195.71635: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204195.71721: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
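The repeated "auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566'" lines above show that each low-level command, and the SFTP transfer of AnsiballZ_systemd.py that follows, reuse one persistent OpenSSH ControlMaster socket instead of opening a fresh connection per command. A hypothetical host_vars snippet of the kind of per-host settings that enable this multiplexing (the actual inventory values for this run are not visible in the log, and Ansible's default ssh_args already request ControlMaster/ControlPersist):

# host_vars/managed-node1.yml -- hypothetical illustration only; the real values
# used in this run are not shown in the log.
ansible_host: 10.31.14.47
# Keep one SSH connection open and reuse it for every task against this host;
# this is what produces the "auto-mux" lines and the ~/.ansible/cp/* socket path.
ansible_ssh_common_args: "-o ControlMaster=auto -o ControlPersist=60s"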
<<< 41016 1727204195.71814: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmps499xfed /root/.ansible/tmp/ansible-tmp-1727204195.3123918-42626-147377005602979/AnsiballZ_systemd.py <<< 41016 1727204195.71818: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204195.3123918-42626-147377005602979/AnsiballZ_systemd.py" <<< 41016 1727204195.71895: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmps499xfed" to remote "/root/.ansible/tmp/ansible-tmp-1727204195.3123918-42626-147377005602979/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204195.3123918-42626-147377005602979/AnsiballZ_systemd.py" <<< 41016 1727204195.73608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204195.73764: stderr chunk (state=3): >>><<< 41016 1727204195.73767: stdout chunk (state=3): >>><<< 41016 1727204195.73769: done transferring module to remote 41016 1727204195.73771: _low_level_execute_command(): starting 41016 1727204195.73773: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204195.3123918-42626-147377005602979/ /root/.ansible/tmp/ansible-tmp-1727204195.3123918-42626-147377005602979/AnsiballZ_systemd.py && sleep 0' 41016 1727204195.74378: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204195.74420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204195.74499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204195.76466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204195.76480: stdout chunk (state=3): >>><<< 41016 1727204195.76556: stderr chunk (state=3): >>><<< 41016 1727204195.76559: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204195.76562: _low_level_execute_command(): starting 41016 1727204195.76565: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204195.3123918-42626-147377005602979/AnsiballZ_systemd.py && sleep 0' 41016 1727204195.76921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204195.76934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204195.76950: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204195.76992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204195.77006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204195.77099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204196.08585: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", 
"ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10747904", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3297079296", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1558553000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 41016 1727204196.08592: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", 
"LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target 
sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 41016 1727204196.10901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
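The JSON above is the raw stdout of AnsiballZ_systemd.py: every systemd property of NetworkManager.service plus an "invocation" block echoing the effective module arguments (name=NetworkManager, state=started, enabled=true, scope=system). Because the module reports "changed": false, the unit was already enabled and running, so nothing was modified. A minimal sketch of a task that would produce this invocation (an assumption for illustration; the role's real task file is not reproduced in this log):

- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  # no_log is in effect for this result ('_ansible_no_log': True further down),
  # which is why the task output is censored later in the log.
  no_log: true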
<<< 41016 1727204196.10982: stderr chunk (state=3): >>><<< 41016 1727204196.10987: stdout chunk (state=3): >>><<< 41016 1727204196.10990: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10747904", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3297079296", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1558553000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 41016 1727204196.11178: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204195.3123918-42626-147377005602979/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204196.11224: _low_level_execute_command(): starting 41016 1727204196.11227: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204195.3123918-42626-147377005602979/ > /dev/null 2>&1 && sleep 0' 41016 1727204196.11973: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204196.11999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204196.12125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204196.14090: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 41016 1727204196.14123: stderr chunk (state=3): >>><<< 41016 1727204196.14127: stdout chunk (state=3): >>><<< 41016 1727204196.14140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204196.14146: handler run complete 41016 1727204196.14185: attempt loop complete, returning result 41016 1727204196.14188: _execute() done 41016 1727204196.14190: dumping result to json 41016 1727204196.14202: done dumping result, returning 41016 1727204196.14212: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-12d5-0ec4-000000000027] 41016 1727204196.14215: sending task result for task 028d2410-947f-12d5-0ec4-000000000027 41016 1727204196.14447: done sending task result for task 028d2410-947f-12d5-0ec4-000000000027 41016 1727204196.14450: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41016 1727204196.14504: no more pending results, returning what we have 41016 1727204196.14508: results queue empty 41016 1727204196.14511: checking for any_errors_fatal 41016 1727204196.14520: done checking for any_errors_fatal 41016 1727204196.14521: checking for max_fail_percentage 41016 1727204196.14523: done checking for max_fail_percentage 41016 1727204196.14523: checking to see if all hosts have failed and the running result is not ok 41016 1727204196.14524: done checking to see if all hosts have failed 41016 1727204196.14525: getting the remaining hosts for this loop 41016 1727204196.14526: done getting the remaining hosts for this loop 41016 1727204196.14530: getting the next task for host managed-node1 41016 1727204196.14536: done getting next task for host managed-node1 41016 1727204196.14539: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41016 1727204196.14542: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204196.14552: getting variables 41016 1727204196.14553: in VariableManager get_vars() 41016 1727204196.14598: Calling all_inventory to load vars for managed-node1 41016 1727204196.14600: Calling groups_inventory to load vars for managed-node1 41016 1727204196.14603: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204196.14614: Calling all_plugins_play to load vars for managed-node1 41016 1727204196.14616: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204196.14619: Calling groups_plugins_play to load vars for managed-node1 41016 1727204196.15466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204196.16740: done with get_vars() 41016 1727204196.16757: done getting variables 41016 1727204196.16801: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:56:36 -0400 (0:00:01.082) 0:00:19.844 ***** 41016 1727204196.16824: entering _queue_task() for managed-node1/service 41016 1727204196.17057: worker is 1 (out of 1 available) 41016 1727204196.17070: exiting _queue_task() for managed-node1/service 41016 1727204196.17083: done queuing things up, now waiting for results queue to drain 41016 1727204196.17084: waiting for pending results... 
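The worker just queued evaluates the wpa_supplicant task in the trace that follows: the distribution check (ansible_distribution_major_version != '6') and the provider check (network_provider == "nm") both pass, but __network_wpa_supplicant_required comes back False after the role defaults __network_ieee802_1x_connections_defined and __network_wireless_connections_defined are checked against network_connections, so the task is skipped. A simplified sketch of that conditional structure (an assumption; the real task at roles/network/tasks/main.yml:133 may differ in detail):

- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool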
41016 1727204196.17262: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41016 1727204196.17349: in run() - task 028d2410-947f-12d5-0ec4-000000000028 41016 1727204196.17360: variable 'ansible_search_path' from source: unknown 41016 1727204196.17364: variable 'ansible_search_path' from source: unknown 41016 1727204196.17393: calling self._execute() 41016 1727204196.17468: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204196.17473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204196.17483: variable 'omit' from source: magic vars 41016 1727204196.17761: variable 'ansible_distribution_major_version' from source: facts 41016 1727204196.17769: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204196.17853: variable 'network_provider' from source: set_fact 41016 1727204196.17857: Evaluated conditional (network_provider == "nm"): True 41016 1727204196.17924: variable '__network_wpa_supplicant_required' from source: role '' defaults 41016 1727204196.17989: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41016 1727204196.18105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204196.20307: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204196.20354: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204196.20382: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204196.20407: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204196.20429: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204196.20491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204196.20511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204196.20530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204196.20559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204196.20570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204196.20604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204196.20624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 41016 1727204196.20640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204196.20667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204196.20679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204196.20708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204196.20725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204196.20741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204196.20770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204196.20778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204196.20866: variable 'network_connections' from source: task vars 41016 1727204196.20878: variable 'interface0' from source: play vars 41016 1727204196.20931: variable 'interface0' from source: play vars 41016 1727204196.20938: variable 'interface0' from source: play vars 41016 1727204196.20983: variable 'interface0' from source: play vars 41016 1727204196.20992: variable 'interface1' from source: play vars 41016 1727204196.21038: variable 'interface1' from source: play vars 41016 1727204196.21043: variable 'interface1' from source: play vars 41016 1727204196.21086: variable 'interface1' from source: play vars 41016 1727204196.21152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204196.21262: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204196.21289: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204196.21310: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204196.21335: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204196.21364: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204196.21381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204196.21398: 
Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204196.21418: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204196.21456: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204196.21639: variable 'network_connections' from source: task vars 41016 1727204196.21644: variable 'interface0' from source: play vars 41016 1727204196.21708: variable 'interface0' from source: play vars 41016 1727204196.21711: variable 'interface0' from source: play vars 41016 1727204196.21783: variable 'interface0' from source: play vars 41016 1727204196.21786: variable 'interface1' from source: play vars 41016 1727204196.21981: variable 'interface1' from source: play vars 41016 1727204196.21984: variable 'interface1' from source: play vars 41016 1727204196.21987: variable 'interface1' from source: play vars 41016 1727204196.21989: Evaluated conditional (__network_wpa_supplicant_required): False 41016 1727204196.21991: when evaluation is False, skipping this task 41016 1727204196.21993: _execute() done 41016 1727204196.21995: dumping result to json 41016 1727204196.21996: done dumping result, returning 41016 1727204196.21998: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-12d5-0ec4-000000000028] 41016 1727204196.22000: sending task result for task 028d2410-947f-12d5-0ec4-000000000028 41016 1727204196.22061: done sending task result for task 028d2410-947f-12d5-0ec4-000000000028 41016 1727204196.22064: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 41016 1727204196.22114: no more pending results, returning what we have 41016 1727204196.22118: results queue empty 41016 1727204196.22119: checking for any_errors_fatal 41016 1727204196.22137: done checking for any_errors_fatal 41016 1727204196.22138: checking for max_fail_percentage 41016 1727204196.22139: done checking for max_fail_percentage 41016 1727204196.22140: checking to see if all hosts have failed and the running result is not ok 41016 1727204196.22141: done checking to see if all hosts have failed 41016 1727204196.22142: getting the remaining hosts for this loop 41016 1727204196.22143: done getting the remaining hosts for this loop 41016 1727204196.22147: getting the next task for host managed-node1 41016 1727204196.22154: done getting next task for host managed-node1 41016 1727204196.22158: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 41016 1727204196.22161: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204196.22183: getting variables 41016 1727204196.22185: in VariableManager get_vars() 41016 1727204196.22224: Calling all_inventory to load vars for managed-node1 41016 1727204196.22227: Calling groups_inventory to load vars for managed-node1 41016 1727204196.22229: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204196.22239: Calling all_plugins_play to load vars for managed-node1 41016 1727204196.22241: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204196.22243: Calling groups_plugins_play to load vars for managed-node1 41016 1727204196.23635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204196.25140: done with get_vars() 41016 1727204196.25164: done getting variables 41016 1727204196.25231: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:56:36 -0400 (0:00:00.084) 0:00:19.928 ***** 41016 1727204196.25266: entering _queue_task() for managed-node1/service 41016 1727204196.25598: worker is 1 (out of 1 available) 41016 1727204196.25613: exiting _queue_task() for managed-node1/service 41016 1727204196.25625: done queuing things up, now waiting for results queue to drain 41016 1727204196.25626: waiting for pending results... 41016 1727204196.25909: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 41016 1727204196.26011: in run() - task 028d2410-947f-12d5-0ec4-000000000029 41016 1727204196.26027: variable 'ansible_search_path' from source: unknown 41016 1727204196.26030: variable 'ansible_search_path' from source: unknown 41016 1727204196.26058: calling self._execute() 41016 1727204196.26132: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204196.26136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204196.26150: variable 'omit' from source: magic vars 41016 1727204196.26424: variable 'ansible_distribution_major_version' from source: facts 41016 1727204196.26433: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204196.26514: variable 'network_provider' from source: set_fact 41016 1727204196.26521: Evaluated conditional (network_provider == "initscripts"): False 41016 1727204196.26524: when evaluation is False, skipping this task 41016 1727204196.26527: _execute() done 41016 1727204196.26529: dumping result to json 41016 1727204196.26531: done dumping result, returning 41016 1727204196.26547: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-12d5-0ec4-000000000029] 41016 1727204196.26722: sending task result for task 028d2410-947f-12d5-0ec4-000000000029 41016 1727204196.26790: done sending task result for task 028d2410-947f-12d5-0ec4-000000000029 41016 1727204196.26794: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41016 
1727204196.26829: no more pending results, returning what we have 41016 1727204196.26832: results queue empty 41016 1727204196.26833: checking for any_errors_fatal 41016 1727204196.26838: done checking for any_errors_fatal 41016 1727204196.26839: checking for max_fail_percentage 41016 1727204196.26840: done checking for max_fail_percentage 41016 1727204196.26841: checking to see if all hosts have failed and the running result is not ok 41016 1727204196.26842: done checking to see if all hosts have failed 41016 1727204196.26843: getting the remaining hosts for this loop 41016 1727204196.26844: done getting the remaining hosts for this loop 41016 1727204196.26847: getting the next task for host managed-node1 41016 1727204196.26853: done getting next task for host managed-node1 41016 1727204196.26856: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41016 1727204196.26859: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204196.26872: getting variables 41016 1727204196.26874: in VariableManager get_vars() 41016 1727204196.26912: Calling all_inventory to load vars for managed-node1 41016 1727204196.26915: Calling groups_inventory to load vars for managed-node1 41016 1727204196.26917: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204196.26926: Calling all_plugins_play to load vars for managed-node1 41016 1727204196.26928: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204196.26931: Calling groups_plugins_play to load vars for managed-node1 41016 1727204196.28183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204196.29832: done with get_vars() 41016 1727204196.29855: done getting variables 41016 1727204196.29923: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:56:36 -0400 (0:00:00.046) 0:00:19.975 ***** 41016 1727204196.29955: entering _queue_task() for managed-node1/copy 41016 1727204196.30282: worker is 1 (out of 1 available) 41016 1727204196.30293: exiting _queue_task() for managed-node1/copy 41016 1727204196.30306: done queuing things up, now waiting for results queue to drain 41016 1727204196.30307: waiting for pending results... 
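Two consecutive tasks in this stretch are gated on the initscripts provider: "Enable network service" was skipped above because network_provider == "initscripts" evaluated False, and "Ensure initscripts network file dependency is present", queued just now with the copy action plugin, is skipped on the same condition immediately below. Since network_provider was set to "nm" earlier in the run, both are no-ops here. A hypothetical sketch of that gate (not the role's actual YAML):

- name: Enable network service
  ansible.builtin.service:
    name: network      # legacy initscripts service name (assumption)
    state: started
    enabled: true
  when: network_provider == "initscripts"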
41016 1727204196.30695: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41016 1727204196.30738: in run() - task 028d2410-947f-12d5-0ec4-00000000002a 41016 1727204196.30757: variable 'ansible_search_path' from source: unknown 41016 1727204196.30765: variable 'ansible_search_path' from source: unknown 41016 1727204196.30812: calling self._execute() 41016 1727204196.30907: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204196.30921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204196.30935: variable 'omit' from source: magic vars 41016 1727204196.31319: variable 'ansible_distribution_major_version' from source: facts 41016 1727204196.31339: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204196.31463: variable 'network_provider' from source: set_fact 41016 1727204196.31473: Evaluated conditional (network_provider == "initscripts"): False 41016 1727204196.31482: when evaluation is False, skipping this task 41016 1727204196.31491: _execute() done 41016 1727204196.31499: dumping result to json 41016 1727204196.31508: done dumping result, returning 41016 1727204196.31523: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-12d5-0ec4-00000000002a] 41016 1727204196.31533: sending task result for task 028d2410-947f-12d5-0ec4-00000000002a skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 41016 1727204196.31702: no more pending results, returning what we have 41016 1727204196.31707: results queue empty 41016 1727204196.31708: checking for any_errors_fatal 41016 1727204196.31716: done checking for any_errors_fatal 41016 1727204196.31717: checking for max_fail_percentage 41016 1727204196.31719: done checking for max_fail_percentage 41016 1727204196.31720: checking to see if all hosts have failed and the running result is not ok 41016 1727204196.31721: done checking to see if all hosts have failed 41016 1727204196.31722: getting the remaining hosts for this loop 41016 1727204196.31723: done getting the remaining hosts for this loop 41016 1727204196.31727: getting the next task for host managed-node1 41016 1727204196.31734: done getting next task for host managed-node1 41016 1727204196.31739: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41016 1727204196.31742: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204196.31758: getting variables 41016 1727204196.31760: in VariableManager get_vars() 41016 1727204196.31802: Calling all_inventory to load vars for managed-node1 41016 1727204196.31805: Calling groups_inventory to load vars for managed-node1 41016 1727204196.31808: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204196.31820: Calling all_plugins_play to load vars for managed-node1 41016 1727204196.31823: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204196.31826: Calling groups_plugins_play to load vars for managed-node1 41016 1727204196.32490: done sending task result for task 028d2410-947f-12d5-0ec4-00000000002a 41016 1727204196.32494: WORKER PROCESS EXITING 41016 1727204196.33408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204196.34900: done with get_vars() 41016 1727204196.34925: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:56:36 -0400 (0:00:00.050) 0:00:20.026 ***** 41016 1727204196.35011: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 41016 1727204196.35013: Creating lock for fedora.linux_system_roles.network_connections 41016 1727204196.35340: worker is 1 (out of 1 available) 41016 1727204196.35353: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 41016 1727204196.35365: done queuing things up, now waiting for results queue to drain 41016 1727204196.35367: waiting for pending results... 41016 1727204196.35653: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41016 1727204196.35799: in run() - task 028d2410-947f-12d5-0ec4-00000000002b 41016 1727204196.35821: variable 'ansible_search_path' from source: unknown 41016 1727204196.35828: variable 'ansible_search_path' from source: unknown 41016 1727204196.35866: calling self._execute() 41016 1727204196.35962: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204196.35974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204196.35990: variable 'omit' from source: magic vars 41016 1727204196.36358: variable 'ansible_distribution_major_version' from source: facts 41016 1727204196.36373: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204196.36386: variable 'omit' from source: magic vars 41016 1727204196.36439: variable 'omit' from source: magic vars 41016 1727204196.36602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204196.38632: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204196.38706: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204196.38781: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204196.38795: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204196.38827: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204196.38982: 
variable 'network_provider' from source: set_fact 41016 1727204196.39046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204196.39099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204196.39130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204196.39178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204196.39203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204196.39277: variable 'omit' from source: magic vars 41016 1727204196.39393: variable 'omit' from source: magic vars 41016 1727204196.39499: variable 'network_connections' from source: task vars 41016 1727204196.39519: variable 'interface0' from source: play vars 41016 1727204196.39589: variable 'interface0' from source: play vars 41016 1727204196.39634: variable 'interface0' from source: play vars 41016 1727204196.39668: variable 'interface0' from source: play vars 41016 1727204196.39688: variable 'interface1' from source: play vars 41016 1727204196.39754: variable 'interface1' from source: play vars 41016 1727204196.39765: variable 'interface1' from source: play vars 41016 1727204196.39853: variable 'interface1' from source: play vars 41016 1727204196.40042: variable 'omit' from source: magic vars 41016 1727204196.40054: variable '__lsr_ansible_managed' from source: task vars 41016 1727204196.40113: variable '__lsr_ansible_managed' from source: task vars 41016 1727204196.40689: Loaded config def from plugin (lookup/template) 41016 1727204196.40721: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 41016 1727204196.40735: File lookup term: get_ansible_managed.j2 41016 1727204196.40742: variable 'ansible_search_path' from source: unknown 41016 1727204196.40750: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 41016 1727204196.40766: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 41016 1727204196.40829: variable 'ansible_search_path' from source: unknown 41016 1727204196.46588: variable 'ansible_managed' from source: unknown 41016 1727204196.46722: variable 'omit' from source: magic vars 41016 1727204196.46753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204196.46787: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204196.46812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204196.46933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204196.46936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204196.46938: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204196.46940: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204196.46941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204196.46980: Set connection var ansible_shell_executable to /bin/sh 41016 1727204196.46991: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204196.47000: Set connection var ansible_shell_type to sh 41016 1727204196.47007: Set connection var ansible_timeout to 10 41016 1727204196.47015: Set connection var ansible_pipelining to False 41016 1727204196.47024: Set connection var ansible_connection to ssh 41016 1727204196.47052: variable 'ansible_shell_executable' from source: unknown 41016 1727204196.47058: variable 'ansible_connection' from source: unknown 41016 1727204196.47063: variable 'ansible_module_compression' from source: unknown 41016 1727204196.47068: variable 'ansible_shell_type' from source: unknown 41016 1727204196.47073: variable 'ansible_shell_executable' from source: unknown 41016 1727204196.47081: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204196.47088: variable 'ansible_pipelining' from source: unknown 41016 1727204196.47094: variable 'ansible_timeout' from source: unknown 41016 1727204196.47112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204196.47236: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204196.47250: variable 'omit' from source: magic vars 41016 1727204196.47270: starting attempt loop 41016 1727204196.47279: running the handler 41016 1727204196.47296: _low_level_execute_command(): starting 41016 1727204196.47369: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204196.48006: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204196.48094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204196.48133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204196.48149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204196.48173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204196.48294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204196.50089: stdout chunk (state=3): >>>/root <<< 41016 1727204196.50233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204196.50253: stdout chunk (state=3): >>><<< 41016 1727204196.50272: stderr chunk (state=3): >>><<< 41016 1727204196.50385: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204196.50389: _low_level_execute_command(): starting 41016 1727204196.50392: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204196.50295-42671-275408712133527 `" && echo ansible-tmp-1727204196.50295-42671-275408712133527="` echo /root/.ansible/tmp/ansible-tmp-1727204196.50295-42671-275408712133527 `" ) && sleep 0' 41016 1727204196.50971: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204196.50991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204196.51006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204196.51118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204196.51122: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204196.51147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204196.51255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204196.53568: stdout chunk (state=3): >>>ansible-tmp-1727204196.50295-42671-275408712133527=/root/.ansible/tmp/ansible-tmp-1727204196.50295-42671-275408712133527 <<< 41016 1727204196.53572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204196.53577: stdout chunk (state=3): >>><<< 41016 1727204196.53580: stderr chunk (state=3): >>><<< 41016 1727204196.53583: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204196.50295-42671-275408712133527=/root/.ansible/tmp/ansible-tmp-1727204196.50295-42671-275408712133527 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204196.53885: variable 'ansible_module_compression' from source: unknown 41016 1727204196.53889: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 41016 1727204196.53892: ANSIBALLZ: Acquiring lock 41016 1727204196.53894: ANSIBALLZ: Lock acquired: 140580604781072 41016 1727204196.53896: ANSIBALLZ: Creating module 41016 1727204196.89458: ANSIBALLZ: Writing module into payload 41016 1727204196.89795: ANSIBALLZ: Writing module 41016 1727204196.89822: ANSIBALLZ: Renaming module 41016 1727204196.89834: ANSIBALLZ: Done creating module 41016 1727204196.89864: variable 'ansible_facts' from source: unknown 41016 1727204196.89970: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204196.50295-42671-275408712133527/AnsiballZ_network_connections.py 
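The module payload is copied to a remote temporary directory over SFTP here because ansible_pipelining was set to False for this connection (see the "Set connection var ansible_pipelining to False" entry earlier in this task); in the entries that follow, the transferred AnsiballZ_network_connections.py is made executable, run with /usr/bin/python3.12, and the temporary directory is removed again. With pipelining enabled, Ansible instead streams the payload to the remote interpreter over the already-established SSH session and skips the temp-file round trips. A minimal sketch of enabling it through inventory variables, assuming a hypothetical group_vars file that is not part of this test setup:

    # group_vars/all.yml (hypothetical placement, illustration only)
    ansible_pipelining: true

Note that pipelining generally requires 'requiretty' to be disabled in sudoers on the managed hosts when privilege escalation is used.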
41016 1727204196.90209: Sending initial data 41016 1727204196.90219: Sent initial data (166 bytes) 41016 1727204196.91197: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204196.91208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204196.91398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204196.91531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204196.91641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204196.93340: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 41016 1727204196.93356: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 41016 1727204196.93370: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 41016 1727204196.93381: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 41016 1727204196.93391: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 41016 1727204196.93399: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 41016 1727204196.93405: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 41016 1727204196.93423: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 41016 1727204196.93436: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204196.93533: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204196.93627: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpfmkm68av /root/.ansible/tmp/ansible-tmp-1727204196.50295-42671-275408712133527/AnsiballZ_network_connections.py <<< 41016 1727204196.93630: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204196.50295-42671-275408712133527/AnsiballZ_network_connections.py" <<< 41016 1727204196.93719: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpfmkm68av" to remote "/root/.ansible/tmp/ansible-tmp-1727204196.50295-42671-275408712133527/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204196.50295-42671-275408712133527/AnsiballZ_network_connections.py" <<< 41016 1727204196.95210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204196.95243: stderr chunk (state=3): >>><<< 41016 1727204196.95381: stdout chunk (state=3): >>><<< 41016 1727204196.95384: done transferring module to remote 41016 1727204196.95386: _low_level_execute_command(): starting 41016 1727204196.95388: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204196.50295-42671-275408712133527/ /root/.ansible/tmp/ansible-tmp-1727204196.50295-42671-275408712133527/AnsiballZ_network_connections.py && sleep 0' 41016 1727204196.96294: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204196.96305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204196.96308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204196.96310: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204196.96312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204196.96397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204196.96401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204196.96485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204196.98628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204196.98632: stderr chunk (state=3): >>><<< 41016 1727204196.98635: stdout chunk (state=3): >>><<< 41016 1727204196.98637: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204196.98639: _low_level_execute_command(): starting 41016 1727204196.98641: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204196.50295-42671-275408712133527/AnsiballZ_network_connections.py && sleep 0' 41016 1727204196.99150: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204196.99166: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204196.99279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204196.99299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204196.99421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204197.65246: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 9a787842-0db5-45cc-82f4-1fb96e28cf45\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9a787842-0db5-45cc-82f4-1fb96e28cf45 (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": 
"2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 41016 1727204197.67986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204197.67990: stdout chunk (state=3): >>><<< 41016 1727204197.67992: stderr chunk (state=3): >>><<< 41016 1727204197.67994: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 9a787842-0db5-45cc-82f4-1fb96e28cf45\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9a787842-0db5-45cc-82f4-1fb96e28cf45 (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], 
"route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 41016 1727204197.67997: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.3/24', '2001:db8::2/32'], 'route': [{'network': '198.51.10.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4}, {'network': '2001:db6::4', 'prefix': 128, 'gateway': '2001:db8::1', 'metric': 2}]}}, {'name': 'ethtest1', 'interface_name': 'ethtest1', 'state': 'up', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.6/24', '2001:db8::4/32'], 'route': [{'network': '198.51.12.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204196.50295-42671-275408712133527/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204197.67999: _low_level_execute_command(): starting 41016 1727204197.68001: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204196.50295-42671-275408712133527/ > /dev/null 2>&1 && sleep 0' 41016 1727204197.68701: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204197.68727: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204197.68753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204197.68797: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204197.68867: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204197.68902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204197.68923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204197.68946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204197.69058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204197.71156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204197.71217: stderr chunk (state=3): >>><<< 41016 1727204197.71228: stdout chunk (state=3): >>><<< 41016 1727204197.71241: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204197.71248: handler run complete 41016 1727204197.71313: attempt loop complete, returning result 41016 1727204197.71316: _execute() done 41016 1727204197.71319: dumping result to json 41016 1727204197.71321: done dumping result, returning 41016 1727204197.71330: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-12d5-0ec4-00000000002b] 41016 1727204197.71332: sending task result for task 028d2410-947f-12d5-0ec4-00000000002b changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/24", "2001:db8::2/32" ], 
"route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.10.64", "prefix": 26 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db6::4", "prefix": 128 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" }, { "autoconnect": false, "interface_name": "ethtest1", "ip": { "address": [ "198.51.100.6/24", "2001:db8::4/32" ], "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.12.128", "prefix": 26 } ] }, "name": "ethtest1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 9a787842-0db5-45cc-82f4-1fb96e28cf45 [006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b [007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9a787842-0db5-45cc-82f4-1fb96e28cf45 (not-active) [008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b (not-active) 41016 1727204197.71617: no more pending results, returning what we have 41016 1727204197.71620: results queue empty 41016 1727204197.71621: checking for any_errors_fatal 41016 1727204197.71631: done checking for any_errors_fatal 41016 1727204197.71632: checking for max_fail_percentage 41016 1727204197.71633: done checking for max_fail_percentage 41016 1727204197.71634: checking to see if all hosts have failed and the running result is not ok 41016 1727204197.71635: done checking to see if all hosts have failed 41016 1727204197.71635: getting the remaining hosts for this loop 41016 1727204197.71637: done getting the remaining hosts for this loop 41016 1727204197.71641: getting the next task for host managed-node1 41016 1727204197.71646: done getting next task for host managed-node1 41016 1727204197.71650: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 41016 1727204197.71652: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204197.71661: getting variables 41016 1727204197.71663: in VariableManager get_vars() 41016 1727204197.71706: Calling all_inventory to load vars for managed-node1 41016 1727204197.71712: Calling groups_inventory to load vars for managed-node1 41016 1727204197.71714: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204197.71720: done sending task result for task 028d2410-947f-12d5-0ec4-00000000002b 41016 1727204197.71723: WORKER PROCESS EXITING 41016 1727204197.71735: Calling all_plugins_play to load vars for managed-node1 41016 1727204197.71737: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204197.71740: Calling groups_plugins_play to load vars for managed-node1 41016 1727204197.72681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204197.73540: done with get_vars() 41016 1727204197.73556: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:56:37 -0400 (0:00:01.386) 0:00:21.412 ***** 41016 1727204197.73618: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 41016 1727204197.73620: Creating lock for fedora.linux_system_roles.network_state 41016 1727204197.73866: worker is 1 (out of 1 available) 41016 1727204197.73879: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 41016 1727204197.73891: done queuing things up, now waiting for results queue to drain 41016 1727204197.73892: waiting for pending results... 41016 1727204197.74078: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 41016 1727204197.74170: in run() - task 028d2410-947f-12d5-0ec4-00000000002c 41016 1727204197.74183: variable 'ansible_search_path' from source: unknown 41016 1727204197.74187: variable 'ansible_search_path' from source: unknown 41016 1727204197.74218: calling self._execute() 41016 1727204197.74288: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204197.74292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204197.74300: variable 'omit' from source: magic vars 41016 1727204197.74578: variable 'ansible_distribution_major_version' from source: facts 41016 1727204197.74587: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204197.74669: variable 'network_state' from source: role '' defaults 41016 1727204197.74678: Evaluated conditional (network_state != {}): False 41016 1727204197.74681: when evaluation is False, skipping this task 41016 1727204197.74684: _execute() done 41016 1727204197.74686: dumping result to json 41016 1727204197.74691: done dumping result, returning 41016 1727204197.74697: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-12d5-0ec4-00000000002c] 41016 1727204197.74702: sending task result for task 028d2410-947f-12d5-0ec4-00000000002c 41016 1727204197.74786: done sending task result for task 028d2410-947f-12d5-0ec4-00000000002c 41016 1727204197.74788: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41016 1727204197.74842: no more pending results, returning what we have 
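The changed result above maps directly onto the network_connections list that the play hands to the role; the variable-resolution entries earlier in that task show the connection names being filled in from the interface0 and interface1 play vars, while the role itself supplies provider: nm and the Ansible-managed header. A sketch of the equivalent variable definition, reconstructed from the module_args printed in the result (the surrounding play structure is an assumption):

    network_connections:
      - name: ethtest0              # supplied via the interface0 play var in the test play
        interface_name: ethtest0
        state: up
        type: ethernet
        autoconnect: false
        ip:
          address:
            - 198.51.100.3/24
            - "2001:db8::2/32"
          route:
            - network: 198.51.10.64
              prefix: 26
              gateway: 198.51.100.6
              metric: 4
            - network: "2001:db6::4"
              prefix: 128
              gateway: "2001:db8::1"
              metric: 2
      - name: ethtest1              # supplied via the interface1 play var in the test play
        interface_name: ethtest1
        state: up
        type: ethernet
        autoconnect: false
        ip:
          address:
            - 198.51.100.6/24
            - "2001:db8::4/32"
          route:
            - network: 198.51.12.128
              prefix: 26
              gateway: 198.51.100.1
              metric: 2

The stderr lines in the result ([005]-[008]) confirm that both profiles were added and brought up, reported as not-active at the moment the module returned.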
41016 1727204197.74846: results queue empty 41016 1727204197.74847: checking for any_errors_fatal 41016 1727204197.74866: done checking for any_errors_fatal 41016 1727204197.74867: checking for max_fail_percentage 41016 1727204197.74868: done checking for max_fail_percentage 41016 1727204197.74869: checking to see if all hosts have failed and the running result is not ok 41016 1727204197.74870: done checking to see if all hosts have failed 41016 1727204197.74870: getting the remaining hosts for this loop 41016 1727204197.74872: done getting the remaining hosts for this loop 41016 1727204197.74878: getting the next task for host managed-node1 41016 1727204197.74884: done getting next task for host managed-node1 41016 1727204197.74888: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41016 1727204197.74891: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204197.74913: getting variables 41016 1727204197.74915: in VariableManager get_vars() 41016 1727204197.74947: Calling all_inventory to load vars for managed-node1 41016 1727204197.74949: Calling groups_inventory to load vars for managed-node1 41016 1727204197.74952: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204197.74959: Calling all_plugins_play to load vars for managed-node1 41016 1727204197.74962: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204197.74964: Calling groups_plugins_play to load vars for managed-node1 41016 1727204197.75732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204197.76591: done with get_vars() 41016 1727204197.76606: done getting variables 41016 1727204197.76649: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:56:37 -0400 (0:00:00.030) 0:00:21.442 ***** 41016 1727204197.76671: entering _queue_task() for managed-node1/debug 41016 1727204197.76881: worker is 1 (out of 1 available) 41016 1727204197.76894: exiting _queue_task() for managed-node1/debug 41016 1727204197.76906: done queuing things up, now waiting for results queue to drain 41016 1727204197.76907: waiting for pending results... 
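The task queued here (tasks/main.yml:177) is a plain debug of the stderr lines captured from the network_connections run; its output a few entries below shows exactly __network_connections_result.stderr_lines. A sketch of what the role task presumably looks like (the actual role file is not reproduced here):

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines

The companion "Show debug messages for the network_connections" task that follows at tasks/main.yml:181 then prints the full __network_connections_result structure.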
41016 1727204197.77087: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41016 1727204197.77179: in run() - task 028d2410-947f-12d5-0ec4-00000000002d 41016 1727204197.77191: variable 'ansible_search_path' from source: unknown 41016 1727204197.77194: variable 'ansible_search_path' from source: unknown 41016 1727204197.77224: calling self._execute() 41016 1727204197.77297: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204197.77301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204197.77310: variable 'omit' from source: magic vars 41016 1727204197.77586: variable 'ansible_distribution_major_version' from source: facts 41016 1727204197.77596: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204197.77602: variable 'omit' from source: magic vars 41016 1727204197.77642: variable 'omit' from source: magic vars 41016 1727204197.77666: variable 'omit' from source: magic vars 41016 1727204197.77702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204197.77729: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204197.77744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204197.77757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204197.77767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204197.77794: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204197.77798: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204197.77800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204197.77868: Set connection var ansible_shell_executable to /bin/sh 41016 1727204197.77871: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204197.77878: Set connection var ansible_shell_type to sh 41016 1727204197.77903: Set connection var ansible_timeout to 10 41016 1727204197.77906: Set connection var ansible_pipelining to False 41016 1727204197.77909: Set connection var ansible_connection to ssh 41016 1727204197.77911: variable 'ansible_shell_executable' from source: unknown 41016 1727204197.77915: variable 'ansible_connection' from source: unknown 41016 1727204197.77918: variable 'ansible_module_compression' from source: unknown 41016 1727204197.77920: variable 'ansible_shell_type' from source: unknown 41016 1727204197.77923: variable 'ansible_shell_executable' from source: unknown 41016 1727204197.77926: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204197.77930: variable 'ansible_pipelining' from source: unknown 41016 1727204197.77933: variable 'ansible_timeout' from source: unknown 41016 1727204197.77937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204197.78037: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 
1727204197.78045: variable 'omit' from source: magic vars 41016 1727204197.78050: starting attempt loop 41016 1727204197.78053: running the handler 41016 1727204197.78145: variable '__network_connections_result' from source: set_fact 41016 1727204197.78196: handler run complete 41016 1727204197.78209: attempt loop complete, returning result 41016 1727204197.78211: _execute() done 41016 1727204197.78217: dumping result to json 41016 1727204197.78221: done dumping result, returning 41016 1727204197.78232: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-12d5-0ec4-00000000002d] 41016 1727204197.78235: sending task result for task 028d2410-947f-12d5-0ec4-00000000002d 41016 1727204197.78311: done sending task result for task 028d2410-947f-12d5-0ec4-00000000002d 41016 1727204197.78314: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 9a787842-0db5-45cc-82f4-1fb96e28cf45", "[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b", "[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9a787842-0db5-45cc-82f4-1fb96e28cf45 (not-active)", "[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b (not-active)" ] } 41016 1727204197.78396: no more pending results, returning what we have 41016 1727204197.78399: results queue empty 41016 1727204197.78400: checking for any_errors_fatal 41016 1727204197.78406: done checking for any_errors_fatal 41016 1727204197.78407: checking for max_fail_percentage 41016 1727204197.78408: done checking for max_fail_percentage 41016 1727204197.78409: checking to see if all hosts have failed and the running result is not ok 41016 1727204197.78410: done checking to see if all hosts have failed 41016 1727204197.78410: getting the remaining hosts for this loop 41016 1727204197.78412: done getting the remaining hosts for this loop 41016 1727204197.78415: getting the next task for host managed-node1 41016 1727204197.78421: done getting next task for host managed-node1 41016 1727204197.78425: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41016 1727204197.78427: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204197.78437: getting variables 41016 1727204197.78440: in VariableManager get_vars() 41016 1727204197.78472: Calling all_inventory to load vars for managed-node1 41016 1727204197.78474: Calling groups_inventory to load vars for managed-node1 41016 1727204197.78477: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204197.78486: Calling all_plugins_play to load vars for managed-node1 41016 1727204197.78488: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204197.78490: Calling groups_plugins_play to load vars for managed-node1 41016 1727204197.79350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204197.80198: done with get_vars() 41016 1727204197.80213: done getting variables 41016 1727204197.80251: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:56:37 -0400 (0:00:00.036) 0:00:21.478 ***** 41016 1727204197.80280: entering _queue_task() for managed-node1/debug 41016 1727204197.80492: worker is 1 (out of 1 available) 41016 1727204197.80505: exiting _queue_task() for managed-node1/debug 41016 1727204197.80515: done queuing things up, now waiting for results queue to drain 41016 1727204197.80517: waiting for pending results... 41016 1727204197.80693: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41016 1727204197.80775: in run() - task 028d2410-947f-12d5-0ec4-00000000002e 41016 1727204197.80788: variable 'ansible_search_path' from source: unknown 41016 1727204197.80792: variable 'ansible_search_path' from source: unknown 41016 1727204197.80821: calling self._execute() 41016 1727204197.80890: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204197.80894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204197.80902: variable 'omit' from source: magic vars 41016 1727204197.81172: variable 'ansible_distribution_major_version' from source: facts 41016 1727204197.81183: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204197.81194: variable 'omit' from source: magic vars 41016 1727204197.81234: variable 'omit' from source: magic vars 41016 1727204197.81258: variable 'omit' from source: magic vars 41016 1727204197.81296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204197.81322: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204197.81336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204197.81349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204197.81358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204197.81382: variable 
'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204197.81385: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204197.81387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204197.81454: Set connection var ansible_shell_executable to /bin/sh 41016 1727204197.81457: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204197.81463: Set connection var ansible_shell_type to sh 41016 1727204197.81468: Set connection var ansible_timeout to 10 41016 1727204197.81473: Set connection var ansible_pipelining to False 41016 1727204197.81482: Set connection var ansible_connection to ssh 41016 1727204197.81497: variable 'ansible_shell_executable' from source: unknown 41016 1727204197.81500: variable 'ansible_connection' from source: unknown 41016 1727204197.81506: variable 'ansible_module_compression' from source: unknown 41016 1727204197.81508: variable 'ansible_shell_type' from source: unknown 41016 1727204197.81513: variable 'ansible_shell_executable' from source: unknown 41016 1727204197.81515: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204197.81517: variable 'ansible_pipelining' from source: unknown 41016 1727204197.81519: variable 'ansible_timeout' from source: unknown 41016 1727204197.81521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204197.81618: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204197.81625: variable 'omit' from source: magic vars 41016 1727204197.81638: starting attempt loop 41016 1727204197.81641: running the handler 41016 1727204197.81673: variable '__network_connections_result' from source: set_fact 41016 1727204197.81728: variable '__network_connections_result' from source: set_fact 41016 1727204197.81858: handler run complete 41016 1727204197.81881: attempt loop complete, returning result 41016 1727204197.81884: _execute() done 41016 1727204197.81887: dumping result to json 41016 1727204197.81893: done dumping result, returning 41016 1727204197.81900: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-12d5-0ec4-00000000002e] 41016 1727204197.81903: sending task result for task 028d2410-947f-12d5-0ec4-00000000002e 41016 1727204197.82000: done sending task result for task 028d2410-947f-12d5-0ec4-00000000002e 41016 1727204197.82003: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/24", "2001:db8::2/32" ], "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.10.64", "prefix": 26 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db6::4", "prefix": 128 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" }, { "autoconnect": false, "interface_name": "ethtest1", "ip": { "address": [ "198.51.100.6/24", "2001:db8::4/32" ], "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.12.128", "prefix": 26 } ] }, "name": "ethtest1", 
"state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 9a787842-0db5-45cc-82f4-1fb96e28cf45\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9a787842-0db5-45cc-82f4-1fb96e28cf45 (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b (not-active)\n", "stderr_lines": [ "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 9a787842-0db5-45cc-82f4-1fb96e28cf45", "[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b", "[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 9a787842-0db5-45cc-82f4-1fb96e28cf45 (not-active)", "[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b (not-active)" ] } } 41016 1727204197.82125: no more pending results, returning what we have 41016 1727204197.82128: results queue empty 41016 1727204197.82129: checking for any_errors_fatal 41016 1727204197.82133: done checking for any_errors_fatal 41016 1727204197.82134: checking for max_fail_percentage 41016 1727204197.82135: done checking for max_fail_percentage 41016 1727204197.82136: checking to see if all hosts have failed and the running result is not ok 41016 1727204197.82137: done checking to see if all hosts have failed 41016 1727204197.82137: getting the remaining hosts for this loop 41016 1727204197.82138: done getting the remaining hosts for this loop 41016 1727204197.82141: getting the next task for host managed-node1 41016 1727204197.82147: done getting next task for host managed-node1 41016 1727204197.82150: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41016 1727204197.82153: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204197.82161: getting variables 41016 1727204197.82162: in VariableManager get_vars() 41016 1727204197.82200: Calling all_inventory to load vars for managed-node1 41016 1727204197.82202: Calling groups_inventory to load vars for managed-node1 41016 1727204197.82204: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204197.82212: Calling all_plugins_play to load vars for managed-node1 41016 1727204197.82213: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204197.82215: Calling groups_plugins_play to load vars for managed-node1 41016 1727204197.82970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204197.83837: done with get_vars() 41016 1727204197.83853: done getting variables 41016 1727204197.83894: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:56:37 -0400 (0:00:00.036) 0:00:21.515 ***** 41016 1727204197.83921: entering _queue_task() for managed-node1/debug 41016 1727204197.84146: worker is 1 (out of 1 available) 41016 1727204197.84160: exiting _queue_task() for managed-node1/debug 41016 1727204197.84172: done queuing things up, now waiting for results queue to drain 41016 1727204197.84177: waiting for pending results... 41016 1727204197.84346: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41016 1727204197.84427: in run() - task 028d2410-947f-12d5-0ec4-00000000002f 41016 1727204197.84438: variable 'ansible_search_path' from source: unknown 41016 1727204197.84443: variable 'ansible_search_path' from source: unknown 41016 1727204197.84468: calling self._execute() 41016 1727204197.84543: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204197.84547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204197.84556: variable 'omit' from source: magic vars 41016 1727204197.84848: variable 'ansible_distribution_major_version' from source: facts 41016 1727204197.84857: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204197.84939: variable 'network_state' from source: role '' defaults 41016 1727204197.84953: Evaluated conditional (network_state != {}): False 41016 1727204197.84956: when evaluation is False, skipping this task 41016 1727204197.84959: _execute() done 41016 1727204197.84961: dumping result to json 41016 1727204197.84964: done dumping result, returning 41016 1727204197.84969: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-12d5-0ec4-00000000002f] 41016 1727204197.84974: sending task result for task 028d2410-947f-12d5-0ec4-00000000002f 41016 1727204197.85059: done sending task result for task 028d2410-947f-12d5-0ec4-00000000002f 41016 1727204197.85062: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "network_state != {}" } 41016 1727204197.85115: no more pending results, returning what we 
have 41016 1727204197.85119: results queue empty 41016 1727204197.85121: checking for any_errors_fatal 41016 1727204197.85134: done checking for any_errors_fatal 41016 1727204197.85134: checking for max_fail_percentage 41016 1727204197.85136: done checking for max_fail_percentage 41016 1727204197.85137: checking to see if all hosts have failed and the running result is not ok 41016 1727204197.85137: done checking to see if all hosts have failed 41016 1727204197.85138: getting the remaining hosts for this loop 41016 1727204197.85139: done getting the remaining hosts for this loop 41016 1727204197.85143: getting the next task for host managed-node1 41016 1727204197.85149: done getting next task for host managed-node1 41016 1727204197.85153: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 41016 1727204197.85155: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204197.85168: getting variables 41016 1727204197.85169: in VariableManager get_vars() 41016 1727204197.85204: Calling all_inventory to load vars for managed-node1 41016 1727204197.85206: Calling groups_inventory to load vars for managed-node1 41016 1727204197.85208: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204197.85218: Calling all_plugins_play to load vars for managed-node1 41016 1727204197.85220: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204197.85222: Calling groups_plugins_play to load vars for managed-node1 41016 1727204197.86134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204197.87500: done with get_vars() 41016 1727204197.87519: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:56:37 -0400 (0:00:00.036) 0:00:21.551 ***** 41016 1727204197.87587: entering _queue_task() for managed-node1/ping 41016 1727204197.87588: Creating lock for ping 41016 1727204197.87829: worker is 1 (out of 1 available) 41016 1727204197.87844: exiting _queue_task() for managed-node1/ping 41016 1727204197.87855: done queuing things up, now waiting for results queue to drain 41016 1727204197.87856: waiting for pending results... 
41016 1727204197.88030: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 41016 1727204197.88117: in run() - task 028d2410-947f-12d5-0ec4-000000000030 41016 1727204197.88127: variable 'ansible_search_path' from source: unknown 41016 1727204197.88130: variable 'ansible_search_path' from source: unknown 41016 1727204197.88157: calling self._execute() 41016 1727204197.88226: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204197.88230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204197.88238: variable 'omit' from source: magic vars 41016 1727204197.88514: variable 'ansible_distribution_major_version' from source: facts 41016 1727204197.88527: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204197.88530: variable 'omit' from source: magic vars 41016 1727204197.88561: variable 'omit' from source: magic vars 41016 1727204197.88586: variable 'omit' from source: magic vars 41016 1727204197.88618: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204197.88646: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204197.88660: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204197.88673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204197.88685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204197.88708: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204197.88713: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204197.88715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204197.88785: Set connection var ansible_shell_executable to /bin/sh 41016 1727204197.88790: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204197.88795: Set connection var ansible_shell_type to sh 41016 1727204197.88800: Set connection var ansible_timeout to 10 41016 1727204197.88805: Set connection var ansible_pipelining to False 41016 1727204197.88814: Set connection var ansible_connection to ssh 41016 1727204197.88871: variable 'ansible_shell_executable' from source: unknown 41016 1727204197.88874: variable 'ansible_connection' from source: unknown 41016 1727204197.88879: variable 'ansible_module_compression' from source: unknown 41016 1727204197.88881: variable 'ansible_shell_type' from source: unknown 41016 1727204197.88883: variable 'ansible_shell_executable' from source: unknown 41016 1727204197.88885: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204197.88887: variable 'ansible_pipelining' from source: unknown 41016 1727204197.88888: variable 'ansible_timeout' from source: unknown 41016 1727204197.88890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204197.89268: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204197.89272: variable 'omit' from source: magic vars 41016 
1727204197.89274: starting attempt loop 41016 1727204197.89279: running the handler 41016 1727204197.89280: _low_level_execute_command(): starting 41016 1727204197.89282: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204197.89856: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204197.89869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204197.89887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204197.89906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204197.89996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204197.90025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204197.90042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204197.90068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204197.90186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204197.91985: stdout chunk (state=3): >>>/root <<< 41016 1727204197.92130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204197.92145: stdout chunk (state=3): >>><<< 41016 1727204197.92162: stderr chunk (state=3): >>><<< 41016 1727204197.92197: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204197.92218: _low_level_execute_command(): starting 41016 1727204197.92230: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204197.9220445-42743-266867206296386 `" && echo ansible-tmp-1727204197.9220445-42743-266867206296386="` echo /root/.ansible/tmp/ansible-tmp-1727204197.9220445-42743-266867206296386 `" ) && sleep 0' 41016 1727204197.92923: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204197.92998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204197.93070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204197.93093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204197.93119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204197.93244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204197.95414: stdout chunk (state=3): >>>ansible-tmp-1727204197.9220445-42743-266867206296386=/root/.ansible/tmp/ansible-tmp-1727204197.9220445-42743-266867206296386 <<< 41016 1727204197.95519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204197.95782: stderr chunk (state=3): >>><<< 41016 1727204197.95785: stdout chunk (state=3): >>><<< 41016 1727204197.95788: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204197.9220445-42743-266867206296386=/root/.ansible/tmp/ansible-tmp-1727204197.9220445-42743-266867206296386 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204197.95790: variable 'ansible_module_compression' from 
source: unknown 41016 1727204197.95792: ANSIBALLZ: Using lock for ping 41016 1727204197.95794: ANSIBALLZ: Acquiring lock 41016 1727204197.95796: ANSIBALLZ: Lock acquired: 140580604787168 41016 1727204197.95798: ANSIBALLZ: Creating module 41016 1727204198.08998: ANSIBALLZ: Writing module into payload 41016 1727204198.09065: ANSIBALLZ: Writing module 41016 1727204198.09093: ANSIBALLZ: Renaming module 41016 1727204198.09108: ANSIBALLZ: Done creating module 41016 1727204198.09130: variable 'ansible_facts' from source: unknown 41016 1727204198.09237: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204197.9220445-42743-266867206296386/AnsiballZ_ping.py 41016 1727204198.09391: Sending initial data 41016 1727204198.09523: Sent initial data (153 bytes) 41016 1727204198.10950: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204198.11155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204198.11331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204198.11468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204198.13231: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 41016 1727204198.13257: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204198.13347: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204198.13457: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmp793qepje /root/.ansible/tmp/ansible-tmp-1727204197.9220445-42743-266867206296386/AnsiballZ_ping.py <<< 41016 1727204198.13461: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204197.9220445-42743-266867206296386/AnsiballZ_ping.py" <<< 41016 1727204198.13538: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmp793qepje" to remote "/root/.ansible/tmp/ansible-tmp-1727204197.9220445-42743-266867206296386/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204197.9220445-42743-266867206296386/AnsiballZ_ping.py" <<< 41016 1727204198.14633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204198.14636: stdout chunk (state=3): >>><<< 41016 1727204198.14639: stderr chunk (state=3): >>><<< 41016 1727204198.14641: done transferring module to remote 41016 1727204198.14643: _low_level_execute_command(): starting 41016 1727204198.14646: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204197.9220445-42743-266867206296386/ /root/.ansible/tmp/ansible-tmp-1727204197.9220445-42743-266867206296386/AnsiballZ_ping.py && sleep 0' 41016 1727204198.15356: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204198.15365: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204198.15377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204198.15399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204198.15413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204198.15417: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204198.15427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204198.15442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204198.15449: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204198.15460: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41016 1727204198.15464: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204198.15628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204198.15631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204198.15638: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204198.15640: stderr chunk (state=3): >>>debug2: match found <<< 41016 1727204198.15642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204198.15644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204198.15646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204198.15754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204198.15867: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204198.17923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204198.17995: stderr chunk (state=3): >>><<< 41016 1727204198.18005: stdout chunk (state=3): >>><<< 41016 1727204198.18087: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204198.18101: _low_level_execute_command(): starting 41016 1727204198.18115: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204197.9220445-42743-266867206296386/AnsiballZ_ping.py && sleep 0' 41016 1727204198.19426: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204198.19633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204198.19636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204198.19674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204198.19852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204198.36149: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 41016 1727204198.37627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204198.37666: stderr chunk (state=3): >>><<< 41016 1727204198.37684: stdout chunk (state=3): >>><<< 41016 1727204198.37785: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 41016 1727204198.37789: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204197.9220445-42743-266867206296386/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204198.37792: _low_level_execute_command(): starting 41016 1727204198.37794: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204197.9220445-42743-266867206296386/ > /dev/null 2>&1 && sleep 0' 41016 1727204198.38330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204198.38346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204198.38360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204198.38382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204198.38401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204198.38496: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204198.38522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204198.38631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204198.40779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204198.40826: stderr chunk (state=3): >>><<< 41016 1727204198.40834: stdout chunk (state=3): >>><<< 41016 1727204198.40852: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204198.40864: handler run complete 41016 1727204198.40887: attempt loop complete, returning result 41016 1727204198.40895: _execute() done 41016 1727204198.40901: dumping result to json 41016 1727204198.40909: done dumping result, returning 41016 1727204198.40922: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-12d5-0ec4-000000000030] 41016 1727204198.40931: sending task result for task 028d2410-947f-12d5-0ec4-000000000030 ok: [managed-node1] => { "changed": false, "ping": "pong" } 41016 1727204198.41139: no more pending results, returning what we have 41016 1727204198.41143: results queue empty 41016 1727204198.41144: checking for any_errors_fatal 41016 1727204198.41151: done checking for any_errors_fatal 41016 1727204198.41152: checking for max_fail_percentage 41016 1727204198.41154: done checking for max_fail_percentage 41016 1727204198.41155: checking to see if all hosts have failed and the running result is not ok 41016 1727204198.41156: done checking to see if all hosts have failed 41016 1727204198.41156: getting the remaining hosts for this loop 41016 1727204198.41158: done getting the remaining hosts for this loop 41016 1727204198.41162: getting the next task for host managed-node1 41016 1727204198.41173: done getting next task for host managed-node1 41016 1727204198.41178: ^ task is: TASK: meta (role_complete) 41016 1727204198.41182: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204198.41194: getting variables 41016 1727204198.41196: in VariableManager get_vars() 41016 1727204198.41241: Calling all_inventory to load vars for managed-node1 41016 1727204198.41244: Calling groups_inventory to load vars for managed-node1 41016 1727204198.41247: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204198.41257: Calling all_plugins_play to load vars for managed-node1 41016 1727204198.41261: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204198.41264: Calling groups_plugins_play to load vars for managed-node1 41016 1727204198.41989: done sending task result for task 028d2410-947f-12d5-0ec4-000000000030 41016 1727204198.41992: WORKER PROCESS EXITING 41016 1727204198.42909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204198.44544: done with get_vars() 41016 1727204198.44564: done getting variables 41016 1727204198.44762: done queuing things up, now waiting for results queue to drain 41016 1727204198.44764: results queue empty 41016 1727204198.44765: checking for any_errors_fatal 41016 1727204198.44770: done checking for any_errors_fatal 41016 1727204198.44771: checking for max_fail_percentage 41016 1727204198.44772: done checking for max_fail_percentage 41016 1727204198.44773: checking to see if all hosts have failed and the running result is not ok 41016 1727204198.44774: done checking to see if all hosts have failed 41016 1727204198.44774: getting the remaining hosts for this loop 41016 1727204198.44781: done getting the remaining hosts for this loop 41016 1727204198.44784: getting the next task for host managed-node1 41016 1727204198.44788: done getting next task for host managed-node1 41016 1727204198.44791: ^ task is: TASK: Get the IPv4 routes from the route table main 41016 1727204198.44792: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204198.44795: getting variables 41016 1727204198.44796: in VariableManager get_vars() 41016 1727204198.44813: Calling all_inventory to load vars for managed-node1 41016 1727204198.44816: Calling groups_inventory to load vars for managed-node1 41016 1727204198.44818: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204198.44824: Calling all_plugins_play to load vars for managed-node1 41016 1727204198.44826: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204198.44829: Calling groups_plugins_play to load vars for managed-node1 41016 1727204198.46141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204198.47769: done with get_vars() 41016 1727204198.47799: done getting variables 41016 1727204198.47853: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the IPv4 routes from the route table main] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:73 Tuesday 24 September 2024 14:56:38 -0400 (0:00:00.602) 0:00:22.154 ***** 41016 1727204198.47896: entering _queue_task() for managed-node1/command 41016 1727204198.48362: worker is 1 (out of 1 available) 41016 1727204198.48378: exiting _queue_task() for managed-node1/command 41016 1727204198.48392: done queuing things up, now waiting for results queue to drain 41016 1727204198.48393: waiting for pending results... 41016 1727204198.48664: running TaskExecutor() for managed-node1/TASK: Get the IPv4 routes from the route table main 41016 1727204198.48777: in run() - task 028d2410-947f-12d5-0ec4-000000000060 41016 1727204198.48801: variable 'ansible_search_path' from source: unknown 41016 1727204198.48840: calling self._execute() 41016 1727204198.48945: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204198.48956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204198.48970: variable 'omit' from source: magic vars 41016 1727204198.49358: variable 'ansible_distribution_major_version' from source: facts 41016 1727204198.49374: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204198.49388: variable 'omit' from source: magic vars 41016 1727204198.49413: variable 'omit' from source: magic vars 41016 1727204198.49457: variable 'omit' from source: magic vars 41016 1727204198.49506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204198.49549: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204198.49573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204198.49596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204198.49613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204198.49651: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204198.49760: variable 'ansible_host' 
from source: host vars for 'managed-node1' 41016 1727204198.49764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204198.49772: Set connection var ansible_shell_executable to /bin/sh 41016 1727204198.49787: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204198.49797: Set connection var ansible_shell_type to sh 41016 1727204198.49805: Set connection var ansible_timeout to 10 41016 1727204198.49815: Set connection var ansible_pipelining to False 41016 1727204198.49827: Set connection var ansible_connection to ssh 41016 1727204198.49852: variable 'ansible_shell_executable' from source: unknown 41016 1727204198.49861: variable 'ansible_connection' from source: unknown 41016 1727204198.49876: variable 'ansible_module_compression' from source: unknown 41016 1727204198.49886: variable 'ansible_shell_type' from source: unknown 41016 1727204198.49893: variable 'ansible_shell_executable' from source: unknown 41016 1727204198.49899: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204198.49908: variable 'ansible_pipelining' from source: unknown 41016 1727204198.49915: variable 'ansible_timeout' from source: unknown 41016 1727204198.49922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204198.50060: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204198.50077: variable 'omit' from source: magic vars 41016 1727204198.50095: starting attempt loop 41016 1727204198.50102: running the handler 41016 1727204198.50122: _low_level_execute_command(): starting 41016 1727204198.50136: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204198.50817: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204198.50822: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 41016 1727204198.50825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204198.50866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204198.50869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204198.50958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204198.52882: stdout chunk (state=3): >>>/root <<< 41016 1727204198.52893: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 41016 1727204198.52901: stdout chunk (state=3): >>><<< 41016 1727204198.52910: stderr chunk (state=3): >>><<< 41016 1727204198.52939: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204198.52954: _low_level_execute_command(): starting 41016 1727204198.52959: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204198.5293832-42765-31203762543512 `" && echo ansible-tmp-1727204198.5293832-42765-31203762543512="` echo /root/.ansible/tmp/ansible-tmp-1727204198.5293832-42765-31203762543512 `" ) && sleep 0' 41016 1727204198.53536: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204198.53577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204198.53581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204198.53584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204198.53594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204198.53635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204198.53650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204198.53736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204198.55861: stdout chunk (state=3): 
>>>ansible-tmp-1727204198.5293832-42765-31203762543512=/root/.ansible/tmp/ansible-tmp-1727204198.5293832-42765-31203762543512 <<< 41016 1727204198.56025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204198.56028: stdout chunk (state=3): >>><<< 41016 1727204198.56032: stderr chunk (state=3): >>><<< 41016 1727204198.56181: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204198.5293832-42765-31203762543512=/root/.ansible/tmp/ansible-tmp-1727204198.5293832-42765-31203762543512 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204198.56184: variable 'ansible_module_compression' from source: unknown 41016 1727204198.56186: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204198.56188: variable 'ansible_facts' from source: unknown 41016 1727204198.56279: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204198.5293832-42765-31203762543512/AnsiballZ_command.py 41016 1727204198.56433: Sending initial data 41016 1727204198.56448: Sent initial data (155 bytes) 41016 1727204198.56889: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204198.56901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204198.56913: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204198.56966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204198.56983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 41016 1727204198.57054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204198.58827: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 41016 1727204198.58831: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204198.58895: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204198.58974: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmp85lqf78b /root/.ansible/tmp/ansible-tmp-1727204198.5293832-42765-31203762543512/AnsiballZ_command.py <<< 41016 1727204198.58980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204198.5293832-42765-31203762543512/AnsiballZ_command.py" <<< 41016 1727204198.59046: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmp85lqf78b" to remote "/root/.ansible/tmp/ansible-tmp-1727204198.5293832-42765-31203762543512/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204198.5293832-42765-31203762543512/AnsiballZ_command.py" <<< 41016 1727204198.60039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204198.60043: stdout chunk (state=3): >>><<< 41016 1727204198.60046: stderr chunk (state=3): >>><<< 41016 1727204198.60053: done transferring module to remote 41016 1727204198.60063: _low_level_execute_command(): starting 41016 1727204198.60067: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204198.5293832-42765-31203762543512/ /root/.ansible/tmp/ansible-tmp-1727204198.5293832-42765-31203762543512/AnsiballZ_command.py && sleep 0' 41016 1727204198.60581: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204198.60584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204198.60587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204198.60631: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204198.60805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204198.62782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204198.62981: stdout chunk (state=3): >>><<< 41016 1727204198.62984: stderr chunk (state=3): >>><<< 41016 1727204198.62987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204198.62990: _low_level_execute_command(): starting 41016 1727204198.62992: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204198.5293832-42765-31203762543512/AnsiballZ_command.py && sleep 0' 41016 1727204198.64017: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204198.64195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204198.64214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204198.64327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204198.81234: stdout chunk (state=3): >>> {"changed": true, "stdout": "default via 10.31.12.1 dev 
eth0 proto dhcp src 10.31.14.47 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 \n198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 \n198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 \n198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 ", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "route"], "start": "2024-09-24 14:56:38.806009", "end": "2024-09-24 14:56:38.810451", "delta": "0:00:00.004442", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204198.83159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204198.83169: stdout chunk (state=3): >>><<< 41016 1727204198.83218: stderr chunk (state=3): >>><<< 41016 1727204198.83323: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 \n198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 \n198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 \n198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 ", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "route"], "start": "2024-09-24 14:56:38.806009", "end": "2024-09-24 14:56:38.810451", "delta": "0:00:00.004442", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
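The log above traces one full remote command cycle for the "Get the IPv4 routes from the route table main" task: probe the remote home directory with `echo ~`, create a per-task temp directory under ~/.ansible/tmp, push AnsiballZ_command.py over the multiplexed SSH connection, chmod it, run it with /usr/bin/python3.12, and read back the JSON result. The Python sketch below only approximates that sequence with plain ssh/scp subprocess calls so the steps are easier to follow side by side with the log; it is not Ansible's ssh connection plugin, and HOST and LOCAL_PAYLOAD are placeholder values of mine.

#!/usr/bin/env python3
"""Illustrative sketch only: approximates the remote-execution sequence the log
shows for one command task (home-dir probe, temp dir, payload copy, execute,
clean up). It is NOT Ansible's ssh connection plugin; the host string and the
local payload path are placeholders."""
import subprocess
import time

HOST = "root@10.31.14.47"                     # target address seen in the log
LOCAL_PAYLOAD = "/tmp/AnsiballZ_command.py"   # hypothetical local module payload

def ssh(cmd: str) -> subprocess.CompletedProcess:
    # Run a single /bin/sh command on the target and capture stdout/stderr,
    # roughly what _low_level_execute_command() does in the log.
    return subprocess.run(
        ["ssh", HOST, f"/bin/sh -c '{cmd}'"],
        capture_output=True, text=True, check=True,
    )

# 1. Discover the remote home directory ("echo ~ && sleep 0" in the log).
home = ssh("echo ~ && sleep 0").stdout.strip()

# 2. Create a private temp directory under ~/.ansible/tmp (umask 77 in the log).
tmpdir = f"{home}/.ansible/tmp/sketch-tmp-{time.time():.0f}"
ssh(f'umask 77 && mkdir -p "{tmpdir}"')

# 3. Copy the module payload to the temp directory (the log uses an sftp put).
subprocess.run(["scp", LOCAL_PAYLOAD, f"{HOST}:{tmpdir}/AnsiballZ_command.py"], check=True)

# 4. Make it executable and run it with the remote interpreter.
ssh(f"chmod u+x {tmpdir} {tmpdir}/AnsiballZ_command.py")
result = ssh(f"/usr/bin/python3.12 {tmpdir}/AnsiballZ_command.py")
print(result.stdout)   # JSON result, e.g. the "ip -4 route" payload shown above

# 5. Remove the temp directory, as the log does with "rm -f -r ... && sleep 0".
ssh(f"rm -f -r {tmpdir} > /dev/null 2>&1")

In the real run this all rides on the existing ControlMaster socket at /root/.ansible/cp/a0f5415566, which is why every step in the log shows "auto-mux: Trying existing master" instead of a fresh key exchange.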
41016 1727204198.83328: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204198.5293832-42765-31203762543512/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204198.83330: _low_level_execute_command(): starting 41016 1727204198.83332: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204198.5293832-42765-31203762543512/ > /dev/null 2>&1 && sleep 0' 41016 1727204198.84064: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204198.84100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204198.84216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204198.86220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204198.86223: stdout chunk (state=3): >>><<< 41016 1727204198.86226: stderr chunk (state=3): >>><<< 41016 1727204198.86241: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204198.86251: handler run complete 41016 1727204198.86585: Evaluated conditional (False): False 41016 1727204198.86588: attempt loop complete, returning result 41016 1727204198.86591: _execute() done 41016 1727204198.86593: dumping result to json 41016 1727204198.86595: done dumping result, returning 41016 1727204198.86597: done running TaskExecutor() for managed-node1/TASK: Get the IPv4 routes from the route table main [028d2410-947f-12d5-0ec4-000000000060] 41016 1727204198.86599: sending task result for task 028d2410-947f-12d5-0ec4-000000000060 41016 1727204198.86672: done sending task result for task 028d2410-947f-12d5-0ec4-000000000060 41016 1727204198.86677: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "-4", "route" ], "delta": "0:00:00.004442", "end": "2024-09-24 14:56:38.810451", "rc": 0, "start": "2024-09-24 14:56:38.806009" } STDOUT: default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 41016 1727204198.86958: no more pending results, returning what we have 41016 1727204198.86961: results queue empty 41016 1727204198.86962: checking for any_errors_fatal 41016 1727204198.86963: done checking for any_errors_fatal 41016 1727204198.86964: checking for max_fail_percentage 41016 1727204198.86965: done checking for max_fail_percentage 41016 1727204198.86966: checking to see if all hosts have failed and the running result is not ok 41016 1727204198.86967: done checking to see if all hosts have failed 41016 1727204198.86968: getting the remaining hosts for this loop 41016 1727204198.86969: done getting the remaining hosts for this loop 41016 1727204198.86972: getting the next task for host managed-node1 41016 1727204198.86981: done getting next task for host managed-node1 41016 1727204198.86984: ^ task is: TASK: Assert that the route table main contains the specified IPv4 routes 41016 1727204198.86985: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204198.86988: getting variables 41016 1727204198.86990: in VariableManager get_vars() 41016 1727204198.87033: Calling all_inventory to load vars for managed-node1 41016 1727204198.87035: Calling groups_inventory to load vars for managed-node1 41016 1727204198.87038: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204198.87048: Calling all_plugins_play to load vars for managed-node1 41016 1727204198.87050: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204198.87052: Calling groups_plugins_play to load vars for managed-node1 41016 1727204198.95480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204198.97289: done with get_vars() 41016 1727204198.97318: done getting variables 41016 1727204198.97363: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the route table main contains the specified IPv4 routes] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:78 Tuesday 24 September 2024 14:56:38 -0400 (0:00:00.495) 0:00:22.649 ***** 41016 1727204198.97388: entering _queue_task() for managed-node1/assert 41016 1727204198.97740: worker is 1 (out of 1 available) 41016 1727204198.97751: exiting _queue_task() for managed-node1/assert 41016 1727204198.97763: done queuing things up, now waiting for results queue to drain 41016 1727204198.97765: waiting for pending results... 41016 1727204198.98264: running TaskExecutor() for managed-node1/TASK: Assert that the route table main contains the specified IPv4 routes 41016 1727204198.98296: in run() - task 028d2410-947f-12d5-0ec4-000000000061 41016 1727204198.98323: variable 'ansible_search_path' from source: unknown 41016 1727204198.98365: calling self._execute() 41016 1727204198.98492: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204198.98599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204198.98603: variable 'omit' from source: magic vars 41016 1727204198.98935: variable 'ansible_distribution_major_version' from source: facts 41016 1727204198.98952: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204198.98964: variable 'omit' from source: magic vars 41016 1727204198.98994: variable 'omit' from source: magic vars 41016 1727204198.99045: variable 'omit' from source: magic vars 41016 1727204198.99093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204198.99135: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204198.99165: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204198.99191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204198.99207: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204198.99242: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204198.99255: variable 
'ansible_host' from source: host vars for 'managed-node1' 41016 1727204198.99263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204198.99372: Set connection var ansible_shell_executable to /bin/sh 41016 1727204198.99388: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204198.99469: Set connection var ansible_shell_type to sh 41016 1727204198.99472: Set connection var ansible_timeout to 10 41016 1727204198.99476: Set connection var ansible_pipelining to False 41016 1727204198.99479: Set connection var ansible_connection to ssh 41016 1727204198.99481: variable 'ansible_shell_executable' from source: unknown 41016 1727204198.99483: variable 'ansible_connection' from source: unknown 41016 1727204198.99485: variable 'ansible_module_compression' from source: unknown 41016 1727204198.99487: variable 'ansible_shell_type' from source: unknown 41016 1727204198.99490: variable 'ansible_shell_executable' from source: unknown 41016 1727204198.99492: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204198.99494: variable 'ansible_pipelining' from source: unknown 41016 1727204198.99497: variable 'ansible_timeout' from source: unknown 41016 1727204198.99500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204198.99777: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204199.00183: variable 'omit' from source: magic vars 41016 1727204199.00187: starting attempt loop 41016 1727204199.00189: running the handler 41016 1727204199.00192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204199.00788: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204199.00884: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204199.00943: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204199.00984: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204199.01105: variable 'route_table_main_ipv4' from source: set_fact 41016 1727204199.01152: Evaluated conditional (route_table_main_ipv4.stdout is search("198.51.10.64/26 via 198.51.100.6 dev ethtest0\s+(proto static )?metric 4")): True 41016 1727204199.01299: variable 'route_table_main_ipv4' from source: set_fact 41016 1727204199.01340: Evaluated conditional (route_table_main_ipv4.stdout is search("198.51.12.128/26 via 198.51.100.1 dev ethtest1\s+(proto static )?metric 2")): True 41016 1727204199.01352: handler run complete 41016 1727204199.01372: attempt loop complete, returning result 41016 1727204199.01382: _execute() done 41016 1727204199.01389: dumping result to json 41016 1727204199.01396: done dumping result, returning 41016 1727204199.01405: done running TaskExecutor() for managed-node1/TASK: Assert that the route table main contains the specified IPv4 routes [028d2410-947f-12d5-0ec4-000000000061] 41016 1727204199.01417: sending task result for task 028d2410-947f-12d5-0ec4-000000000061 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 41016 
1727204199.01628: no more pending results, returning what we have 41016 1727204199.01633: results queue empty 41016 1727204199.01634: checking for any_errors_fatal 41016 1727204199.01644: done checking for any_errors_fatal 41016 1727204199.01645: checking for max_fail_percentage 41016 1727204199.01647: done checking for max_fail_percentage 41016 1727204199.01648: checking to see if all hosts have failed and the running result is not ok 41016 1727204199.01648: done checking to see if all hosts have failed 41016 1727204199.01649: getting the remaining hosts for this loop 41016 1727204199.01651: done getting the remaining hosts for this loop 41016 1727204199.01654: getting the next task for host managed-node1 41016 1727204199.01660: done getting next task for host managed-node1 41016 1727204199.01664: ^ task is: TASK: Get the IPv6 routes from the route table main 41016 1727204199.01666: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204199.01669: getting variables 41016 1727204199.01671: in VariableManager get_vars() 41016 1727204199.01720: Calling all_inventory to load vars for managed-node1 41016 1727204199.01724: Calling groups_inventory to load vars for managed-node1 41016 1727204199.01727: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204199.01738: Calling all_plugins_play to load vars for managed-node1 41016 1727204199.01741: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204199.01744: Calling groups_plugins_play to load vars for managed-node1 41016 1727204199.02287: done sending task result for task 028d2410-947f-12d5-0ec4-000000000061 41016 1727204199.02290: WORKER PROCESS EXITING 41016 1727204199.02798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204199.04348: done with get_vars() 41016 1727204199.04370: done getting variables 41016 1727204199.04429: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the IPv6 routes from the route table main] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:89 Tuesday 24 September 2024 14:56:39 -0400 (0:00:00.070) 0:00:22.720 ***** 41016 1727204199.04463: entering _queue_task() for managed-node1/command 41016 1727204199.04788: worker is 1 (out of 1 available) 41016 1727204199.04800: exiting _queue_task() for managed-node1/command 41016 1727204199.04811: done queuing things up, now waiting for results queue to drain 41016 1727204199.04812: waiting for pending results... 
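The assert task that just completed evaluates two Jinja `search` tests against the `ip -4 route` output captured by the previous task. A minimal Python sketch of the same check, using the stdout and the two patterns exactly as they appear in the log (the helper function name is mine):

"""Sketch of the check the assert task performs with Jinja's `search` test.
The route table text and the regexes are copied from the log; only the helper
function is an addition for illustration."""
import re

# stdout of "ip -4 route" as captured by the earlier command task.
route_table_main_ipv4 = """\
default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100
10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100
198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4
198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2
198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103
198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104
"""

# The same patterns the assert task evaluates ("proto static " is optional).
patterns = [
    r"198.51.10.64/26 via 198.51.100.6 dev ethtest0\s+(proto static )?metric 4",
    r"198.51.12.128/26 via 198.51.100.1 dev ethtest1\s+(proto static )?metric 2",
]

def assert_routes_present(output: str, regexes: list) -> None:
    # Fail loudly, as the assert action would, if any pattern is missing.
    for rx in regexes:
        assert re.search(rx, output), f"route not found: {rx}"

assert_routes_present(route_table_main_ipv4, patterns)
print("All assertions passed")   # matches the MSG in the task result above

Because both conditionals match, the task result above is "ok" with "changed": false and no further connection to the managed node is needed for this step.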
41016 1727204199.05004: running TaskExecutor() for managed-node1/TASK: Get the IPv6 routes from the route table main 41016 1727204199.05074: in run() - task 028d2410-947f-12d5-0ec4-000000000062 41016 1727204199.05086: variable 'ansible_search_path' from source: unknown 41016 1727204199.05119: calling self._execute() 41016 1727204199.05204: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204199.05207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204199.05219: variable 'omit' from source: magic vars 41016 1727204199.05525: variable 'ansible_distribution_major_version' from source: facts 41016 1727204199.05534: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204199.05539: variable 'omit' from source: magic vars 41016 1727204199.05555: variable 'omit' from source: magic vars 41016 1727204199.05579: variable 'omit' from source: magic vars 41016 1727204199.05612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204199.05640: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204199.05655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204199.05669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204199.05679: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204199.05716: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204199.05719: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204199.05721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204199.05863: Set connection var ansible_shell_executable to /bin/sh 41016 1727204199.05867: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204199.05870: Set connection var ansible_shell_type to sh 41016 1727204199.05873: Set connection var ansible_timeout to 10 41016 1727204199.05877: Set connection var ansible_pipelining to False 41016 1727204199.05879: Set connection var ansible_connection to ssh 41016 1727204199.05882: variable 'ansible_shell_executable' from source: unknown 41016 1727204199.05884: variable 'ansible_connection' from source: unknown 41016 1727204199.05886: variable 'ansible_module_compression' from source: unknown 41016 1727204199.05889: variable 'ansible_shell_type' from source: unknown 41016 1727204199.05891: variable 'ansible_shell_executable' from source: unknown 41016 1727204199.05893: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204199.06081: variable 'ansible_pipelining' from source: unknown 41016 1727204199.06086: variable 'ansible_timeout' from source: unknown 41016 1727204199.06089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204199.06107: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204199.06115: variable 'omit' from source: magic vars 41016 1727204199.06117: starting attempt loop 41016 
1727204199.06120: running the handler 41016 1727204199.06143: _low_level_execute_command(): starting 41016 1727204199.06157: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204199.06920: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204199.07025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204199.07052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204199.07163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204199.08941: stdout chunk (state=3): >>>/root <<< 41016 1727204199.09098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204199.09101: stdout chunk (state=3): >>><<< 41016 1727204199.09104: stderr chunk (state=3): >>><<< 41016 1727204199.09126: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204199.09216: _low_level_execute_command(): starting 41016 1727204199.09220: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204199.0913136-42803-209020195917645 `" && echo ansible-tmp-1727204199.0913136-42803-209020195917645="` echo /root/.ansible/tmp/ansible-tmp-1727204199.0913136-42803-209020195917645 `" ) && sleep 0' 41016 1727204199.09682: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204199.09694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204199.09706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204199.09730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204199.09791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204199.09834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204199.09849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204199.09892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204199.09994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204199.12097: stdout chunk (state=3): >>>ansible-tmp-1727204199.0913136-42803-209020195917645=/root/.ansible/tmp/ansible-tmp-1727204199.0913136-42803-209020195917645 <<< 41016 1727204199.12207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204199.12231: stderr chunk (state=3): >>><<< 41016 1727204199.12235: stdout chunk (state=3): >>><<< 41016 1727204199.12251: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204199.0913136-42803-209020195917645=/root/.ansible/tmp/ansible-tmp-1727204199.0913136-42803-209020195917645 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204199.12311: variable 'ansible_module_compression' from source: unknown 41016 1727204199.12392: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204199.12396: variable 'ansible_facts' from source: unknown 41016 1727204199.12504: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204199.0913136-42803-209020195917645/AnsiballZ_command.py 41016 1727204199.12687: Sending initial data 41016 1727204199.12691: Sent initial data (156 bytes) 41016 1727204199.13120: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204199.13123: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204199.13134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204199.13229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204199.13267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204199.13353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204199.15070: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204199.15156: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204199.15234: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpdfvt8288 /root/.ansible/tmp/ansible-tmp-1727204199.0913136-42803-209020195917645/AnsiballZ_command.py <<< 41016 1727204199.15237: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204199.0913136-42803-209020195917645/AnsiballZ_command.py" <<< 41016 1727204199.15308: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpdfvt8288" to remote "/root/.ansible/tmp/ansible-tmp-1727204199.0913136-42803-209020195917645/AnsiballZ_command.py" <<< 41016 1727204199.15315: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204199.0913136-42803-209020195917645/AnsiballZ_command.py" <<< 41016 1727204199.15995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204199.16035: stderr chunk (state=3): >>><<< 41016 1727204199.16038: stdout chunk (state=3): >>><<< 41016 1727204199.16056: done transferring module to remote 41016 1727204199.16064: _low_level_execute_command(): starting 41016 1727204199.16069: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204199.0913136-42803-209020195917645/ /root/.ansible/tmp/ansible-tmp-1727204199.0913136-42803-209020195917645/AnsiballZ_command.py && sleep 0' 41016 1727204199.16479: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204199.16513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204199.16517: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204199.16519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 41016 1727204199.16521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204199.16523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204199.16571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204199.16579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204199.16653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204199.18584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204199.18613: stderr chunk (state=3): >>><<< 41016 1727204199.18616: stdout chunk (state=3): >>><<< 41016 1727204199.18629: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204199.18632: _low_level_execute_command(): starting 41016 1727204199.18636: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204199.0913136-42803-209020195917645/AnsiballZ_command.py && sleep 0' 41016 1727204199.19051: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204199.19089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204199.19092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204199.19094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204199.19096: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204199.19098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 41016 1727204199.19100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204199.19152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204199.19159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204199.19161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204199.19241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204199.36184: stdout chunk (state=3): >>> {"changed": true, "stdout": "2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium\n2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium\n2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium\nfe80::/64 dev peerethtest0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest1 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest1 proto 
kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-24 14:56:39.354281", "end": "2024-09-24 14:56:39.358371", "delta": "0:00:00.004090", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204199.37838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204199.37903: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. <<< 41016 1727204199.37918: stdout chunk (state=3): >>><<< 41016 1727204199.37927: stderr chunk (state=3): >>><<< 41016 1727204199.37952: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium\n2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium\n2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium\nfe80::/64 dev peerethtest0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest1 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest1 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-24 14:56:39.354281", "end": "2024-09-24 14:56:39.358371", "delta": "0:00:00.004090", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
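The `ip -6 route` result just returned is the raw text the next assert task will pattern-match. As a rough illustration of what that output contains, here is a small Python sketch that tokenizes each line into key/value fields and checks for the static route the test configures on ethtest0; the parser is an ad-hoc approximation of mine, not something the role or iproute2 provides.

"""Sketch only: splits the "ip -6 route" stdout captured above into simple
records and checks the one static route. The tokenizer is a rough
approximation for illustration."""

# stdout of "ip -6 route" from the module result above.
IP6_ROUTES = """\
2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium
2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium
2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium
fe80::/64 dev peerethtest0 proto kernel metric 256 pref medium
fe80::/64 dev peerethtest1 proto kernel metric 256 pref medium
fe80::/64 dev eth0 proto kernel metric 1024 pref medium
fe80::/64 dev ethtest0 proto kernel metric 1024 pref medium
fe80::/64 dev ethtest1 proto kernel metric 1024 pref medium
"""

def parse_route(line: str) -> dict:
    # First token is the destination; the rest arrive as "key value" pairs.
    tokens = line.split()
    route = {"dst": tokens[0]}
    for key, value in zip(tokens[1::2], tokens[2::2]):
        route[key] = value
    return route

routes = [parse_route(line) for line in IP6_ROUTES.splitlines()]

static = [r for r in routes if r.get("proto") == "static"]
assert any(
    r["dst"] == "2001:db6::4"
    and r.get("via") == "2001:db8::1"
    and r.get("dev") == "ethtest0"
    and r.get("metric") == "2"
    for r in static
), "expected static IPv6 route via ethtest0 not found"
print("static IPv6 route present")

The kernel-scope fe80::/64 and 2001:db8::/32 entries come from the test interfaces themselves; only the 2001:db6::4 entry is the route the playbook adds, which is why the later assertion targets it specifically.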
41016 1727204199.38004: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204199.0913136-42803-209020195917645/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204199.38025: _low_level_execute_command(): starting 41016 1727204199.38034: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204199.0913136-42803-209020195917645/ > /dev/null 2>&1 && sleep 0' 41016 1727204199.38652: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204199.38667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204199.38687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204199.38706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204199.38747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 41016 1727204199.38761: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204199.38791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204199.38867: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204199.38890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204199.38918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204199.39032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204199.41069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204199.41104: stdout chunk (state=3): >>><<< 41016 1727204199.41107: stderr chunk (state=3): >>><<< 41016 1727204199.41281: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204199.41285: handler run complete 41016 1727204199.41287: Evaluated conditional (False): False 41016 1727204199.41289: attempt loop complete, returning result 41016 1727204199.41291: _execute() done 41016 1727204199.41293: dumping result to json 41016 1727204199.41295: done dumping result, returning 41016 1727204199.41297: done running TaskExecutor() for managed-node1/TASK: Get the IPv6 routes from the route table main [028d2410-947f-12d5-0ec4-000000000062] 41016 1727204199.41299: sending task result for task 028d2410-947f-12d5-0ec4-000000000062 41016 1727204199.41371: done sending task result for task 028d2410-947f-12d5-0ec4-000000000062 41016 1727204199.41377: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.004090", "end": "2024-09-24 14:56:39.358371", "rc": 0, "start": "2024-09-24 14:56:39.354281" } STDOUT: 2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium 2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium 2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium fe80::/64 dev peerethtest0 proto kernel metric 256 pref medium fe80::/64 dev peerethtest1 proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 1024 pref medium fe80::/64 dev ethtest0 proto kernel metric 1024 pref medium fe80::/64 dev ethtest1 proto kernel metric 1024 pref medium 41016 1727204199.41461: no more pending results, returning what we have 41016 1727204199.41465: results queue empty 41016 1727204199.41466: checking for any_errors_fatal 41016 1727204199.41472: done checking for any_errors_fatal 41016 1727204199.41473: checking for max_fail_percentage 41016 1727204199.41477: done checking for max_fail_percentage 41016 1727204199.41478: checking to see if all hosts have failed and the running result is not ok 41016 1727204199.41479: done checking to see if all hosts have failed 41016 1727204199.41480: getting the remaining hosts for this loop 41016 1727204199.41482: done getting the remaining hosts for this loop 41016 1727204199.41485: getting the next task for host managed-node1 41016 1727204199.41492: done getting next task for host managed-node1 41016 1727204199.41495: ^ task is: TASK: Assert that the route table main contains the specified IPv6 routes 41016 1727204199.41497: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204199.41501: getting variables 41016 1727204199.41503: in VariableManager get_vars() 41016 1727204199.41548: Calling all_inventory to load vars for managed-node1 41016 1727204199.41551: Calling groups_inventory to load vars for managed-node1 41016 1727204199.41554: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204199.41565: Calling all_plugins_play to load vars for managed-node1 41016 1727204199.41568: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204199.41571: Calling groups_plugins_play to load vars for managed-node1 41016 1727204199.43483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204199.44350: done with get_vars() 41016 1727204199.44365: done getting variables 41016 1727204199.44412: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the route table main contains the specified IPv6 routes] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:94 Tuesday 24 September 2024 14:56:39 -0400 (0:00:00.399) 0:00:23.120 ***** 41016 1727204199.44434: entering _queue_task() for managed-node1/assert 41016 1727204199.44670: worker is 1 (out of 1 available) 41016 1727204199.44683: exiting _queue_task() for managed-node1/assert 41016 1727204199.44695: done queuing things up, now waiting for results queue to drain 41016 1727204199.44697: waiting for pending results... 41016 1727204199.44907: running TaskExecutor() for managed-node1/TASK: Assert that the route table main contains the specified IPv6 routes 41016 1727204199.45122: in run() - task 028d2410-947f-12d5-0ec4-000000000063 41016 1727204199.45125: variable 'ansible_search_path' from source: unknown 41016 1727204199.45128: calling self._execute() 41016 1727204199.45381: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204199.45385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204199.45388: variable 'omit' from source: magic vars 41016 1727204199.45634: variable 'ansible_distribution_major_version' from source: facts 41016 1727204199.45645: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204199.45651: variable 'omit' from source: magic vars 41016 1727204199.45674: variable 'omit' from source: magic vars 41016 1727204199.45733: variable 'omit' from source: magic vars 41016 1727204199.45757: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204199.45801: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204199.45825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204199.45842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204199.45853: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204199.45886: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204199.45890: variable 
'ansible_host' from source: host vars for 'managed-node1' 41016 1727204199.45892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204199.45996: Set connection var ansible_shell_executable to /bin/sh 41016 1727204199.46000: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204199.46013: Set connection var ansible_shell_type to sh 41016 1727204199.46017: Set connection var ansible_timeout to 10 41016 1727204199.46027: Set connection var ansible_pipelining to False 41016 1727204199.46048: Set connection var ansible_connection to ssh 41016 1727204199.46064: variable 'ansible_shell_executable' from source: unknown 41016 1727204199.46072: variable 'ansible_connection' from source: unknown 41016 1727204199.46077: variable 'ansible_module_compression' from source: unknown 41016 1727204199.46080: variable 'ansible_shell_type' from source: unknown 41016 1727204199.46082: variable 'ansible_shell_executable' from source: unknown 41016 1727204199.46085: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204199.46087: variable 'ansible_pipelining' from source: unknown 41016 1727204199.46089: variable 'ansible_timeout' from source: unknown 41016 1727204199.46094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204199.46198: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204199.46211: variable 'omit' from source: magic vars 41016 1727204199.46215: starting attempt loop 41016 1727204199.46218: running the handler 41016 1727204199.46332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204199.46499: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204199.46531: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204199.46589: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204199.46616: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204199.46675: variable 'route_table_main_ipv6' from source: set_fact 41016 1727204199.46704: Evaluated conditional (route_table_main_ipv6.stdout is search("2001:db6::4 via 2001:db8::1 dev ethtest0\s+(proto static )?metric 2")): True 41016 1727204199.46707: handler run complete 41016 1727204199.46721: attempt loop complete, returning result 41016 1727204199.46724: _execute() done 41016 1727204199.46727: dumping result to json 41016 1727204199.46729: done dumping result, returning 41016 1727204199.46734: done running TaskExecutor() for managed-node1/TASK: Assert that the route table main contains the specified IPv6 routes [028d2410-947f-12d5-0ec4-000000000063] 41016 1727204199.46738: sending task result for task 028d2410-947f-12d5-0ec4-000000000063 41016 1727204199.46817: done sending task result for task 028d2410-947f-12d5-0ec4-000000000063 41016 1727204199.46821: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 41016 1727204199.46866: no more pending results, returning what we have 41016 1727204199.46870: results queue empty 41016 
1727204199.46871: checking for any_errors_fatal 41016 1727204199.46885: done checking for any_errors_fatal 41016 1727204199.46886: checking for max_fail_percentage 41016 1727204199.46888: done checking for max_fail_percentage 41016 1727204199.46888: checking to see if all hosts have failed and the running result is not ok 41016 1727204199.46889: done checking to see if all hosts have failed 41016 1727204199.46890: getting the remaining hosts for this loop 41016 1727204199.46891: done getting the remaining hosts for this loop 41016 1727204199.46894: getting the next task for host managed-node1 41016 1727204199.46900: done getting next task for host managed-node1 41016 1727204199.46902: ^ task is: TASK: Get the interface1 MAC address 41016 1727204199.46904: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204199.46907: getting variables 41016 1727204199.46911: in VariableManager get_vars() 41016 1727204199.46950: Calling all_inventory to load vars for managed-node1 41016 1727204199.46954: Calling groups_inventory to load vars for managed-node1 41016 1727204199.46956: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204199.46966: Calling all_plugins_play to load vars for managed-node1 41016 1727204199.46968: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204199.46970: Calling groups_plugins_play to load vars for managed-node1 41016 1727204199.47742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204199.49124: done with get_vars() 41016 1727204199.49146: done getting variables 41016 1727204199.49208: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the interface1 MAC address] ****************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:99 Tuesday 24 September 2024 14:56:39 -0400 (0:00:00.048) 0:00:23.168 ***** 41016 1727204199.49238: entering _queue_task() for managed-node1/command 41016 1727204199.49535: worker is 1 (out of 1 available) 41016 1727204199.49548: exiting _queue_task() for managed-node1/command 41016 1727204199.49562: done queuing things up, now waiting for results queue to drain 41016 1727204199.49563: waiting for pending results... 
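The two route-verification steps that just completed (the "ip -6 route" capture and the assertion at tests_route_device.yml:94) likely correspond to YAML along the lines of the sketch below. Only the task names, the command, and the single assert expression are taken verbatim from the log; the register wiring and changed_when: false are assumptions inferred from the reported route_table_main_ipv6 variable and the final "changed": false result, and the real test may assert additional routes.

- name: Get the IPv6 routes from the route table main
  command: ip -6 route
  register: route_table_main_ipv6   # the log only shows the variable name; register vs. set_fact is an assumption
  changed_when: false               # assumed, since the task reports "changed": false although the command module returned a change

- name: Assert that the route table main contains the specified IPv6 routes
  assert:
    that:
      - route_table_main_ipv6.stdout is search("2001:db6::4 via 2001:db8::1 dev ethtest0\s+(proto static )?metric 2")
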
41016 1727204199.49751: running TaskExecutor() for managed-node1/TASK: Get the interface1 MAC address 41016 1727204199.49818: in run() - task 028d2410-947f-12d5-0ec4-000000000064 41016 1727204199.49829: variable 'ansible_search_path' from source: unknown 41016 1727204199.49857: calling self._execute() 41016 1727204199.49939: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204199.49943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204199.49951: variable 'omit' from source: magic vars 41016 1727204199.50239: variable 'ansible_distribution_major_version' from source: facts 41016 1727204199.50249: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204199.50255: variable 'omit' from source: magic vars 41016 1727204199.50273: variable 'omit' from source: magic vars 41016 1727204199.50344: variable 'interface1' from source: play vars 41016 1727204199.50360: variable 'omit' from source: magic vars 41016 1727204199.50394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204199.50421: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204199.50438: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204199.50453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204199.50462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204199.50492: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204199.50495: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204199.50498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204199.50567: Set connection var ansible_shell_executable to /bin/sh 41016 1727204199.50579: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204199.50582: Set connection var ansible_shell_type to sh 41016 1727204199.50584: Set connection var ansible_timeout to 10 41016 1727204199.50590: Set connection var ansible_pipelining to False 41016 1727204199.50596: Set connection var ansible_connection to ssh 41016 1727204199.50616: variable 'ansible_shell_executable' from source: unknown 41016 1727204199.50619: variable 'ansible_connection' from source: unknown 41016 1727204199.50622: variable 'ansible_module_compression' from source: unknown 41016 1727204199.50624: variable 'ansible_shell_type' from source: unknown 41016 1727204199.50626: variable 'ansible_shell_executable' from source: unknown 41016 1727204199.50628: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204199.50631: variable 'ansible_pipelining' from source: unknown 41016 1727204199.50633: variable 'ansible_timeout' from source: unknown 41016 1727204199.50638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204199.50741: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204199.50749: variable 'omit' from source: magic vars 41016 
1727204199.50755: starting attempt loop 41016 1727204199.50758: running the handler 41016 1727204199.50771: _low_level_execute_command(): starting 41016 1727204199.50778: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204199.51485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204199.51489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204199.51565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204199.53343: stdout chunk (state=3): >>>/root <<< 41016 1727204199.53444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204199.53469: stderr chunk (state=3): >>><<< 41016 1727204199.53472: stdout chunk (state=3): >>><<< 41016 1727204199.53496: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204199.53509: _low_level_execute_command(): starting 41016 1727204199.53518: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204199.534958-42822-117763225870125 `" && echo ansible-tmp-1727204199.534958-42822-117763225870125="` echo /root/.ansible/tmp/ansible-tmp-1727204199.534958-42822-117763225870125 `" ) && sleep 0' 41016 1727204199.53936: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204199.53940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204199.53943: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204199.53952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204199.53998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204199.54005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204199.54083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204199.56206: stdout chunk (state=3): >>>ansible-tmp-1727204199.534958-42822-117763225870125=/root/.ansible/tmp/ansible-tmp-1727204199.534958-42822-117763225870125 <<< 41016 1727204199.56317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204199.56341: stderr chunk (state=3): >>><<< 41016 1727204199.56344: stdout chunk (state=3): >>><<< 41016 1727204199.56358: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204199.534958-42822-117763225870125=/root/.ansible/tmp/ansible-tmp-1727204199.534958-42822-117763225870125 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204199.56388: variable 'ansible_module_compression' from source: unknown 41016 1727204199.56433: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204199.56466: variable 'ansible_facts' from source: unknown 41016 1727204199.56518: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204199.534958-42822-117763225870125/AnsiballZ_command.py 41016 1727204199.56612: Sending initial data 41016 1727204199.56615: Sent initial data (155 bytes) 41016 1727204199.57041: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204199.57044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204199.57047: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204199.57049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204199.57099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204199.57103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204199.57191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204199.58955: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 41016 1727204199.58959: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204199.59031: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204199.59110: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpqu3xrlmh /root/.ansible/tmp/ansible-tmp-1727204199.534958-42822-117763225870125/AnsiballZ_command.py <<< 41016 1727204199.59114: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204199.534958-42822-117763225870125/AnsiballZ_command.py" <<< 41016 1727204199.59181: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpqu3xrlmh" to remote "/root/.ansible/tmp/ansible-tmp-1727204199.534958-42822-117763225870125/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204199.534958-42822-117763225870125/AnsiballZ_command.py" <<< 41016 1727204199.59845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204199.59884: stderr chunk (state=3): >>><<< 41016 1727204199.59887: stdout chunk (state=3): >>><<< 41016 1727204199.59926: done transferring module to remote 41016 1727204199.59934: _low_level_execute_command(): starting 41016 1727204199.59939: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204199.534958-42822-117763225870125/ /root/.ansible/tmp/ansible-tmp-1727204199.534958-42822-117763225870125/AnsiballZ_command.py && sleep 0' 41016 1727204199.60334: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204199.60342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204199.60363: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204199.60366: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204199.60368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204199.60427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204199.60437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204199.60440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204199.60512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204199.62506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204199.62529: stderr chunk (state=3): >>><<< 41016 1727204199.62532: stdout chunk (state=3): >>><<< 41016 1727204199.62544: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204199.62547: _low_level_execute_command(): starting 41016 1727204199.62551: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204199.534958-42822-117763225870125/AnsiballZ_command.py && sleep 0' 41016 1727204199.62936: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204199.62944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204199.62966: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204199.62969: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204199.62971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204199.63030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204199.63034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204199.63115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204199.80054: stdout chunk (state=3): >>> {"changed": true, "stdout": "ca:90:ed:ea:28:3e", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/ethtest1/address"], "start": "2024-09-24 14:56:39.795435", "end": "2024-09-24 14:56:39.798670", "delta": "0:00:00.003235", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/ethtest1/address", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204199.82084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204199.82089: stdout chunk (state=3): >>><<< 41016 1727204199.82092: stderr chunk (state=3): >>><<< 41016 1727204199.82094: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ca:90:ed:ea:28:3e", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/ethtest1/address"], "start": "2024-09-24 14:56:39.795435", "end": "2024-09-24 14:56:39.798670", "delta": "0:00:00.003235", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/ethtest1/address", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
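The module result just above belongs to the "Get the interface1 MAC address" task (tests_route_device.yml:99). Reconstructed from the logged command and the interface1 play variable (which resolves to ethtest1 in this run), it plausibly reads as follows; the registered result name is a hypothetical placeholder, since the task file itself is not part of this log.

- name: Get the interface1 MAC address
  command: cat /sys/class/net/{{ interface1 }}/address   # interface1 is ethtest1 here, yielding ca:90:ed:ea:28:3e
  register: interface1_mac   # hypothetical name; the log does not show where the result is stored
  changed_when: false        # assumed from the final "changed": false in the task result
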
41016 1727204199.82097: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/ethtest1/address', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204199.534958-42822-117763225870125/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204199.82100: _low_level_execute_command(): starting 41016 1727204199.82102: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204199.534958-42822-117763225870125/ > /dev/null 2>&1 && sleep 0' 41016 1727204199.82897: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204199.82954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204199.82981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204199.83015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204199.83133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204199.85165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204199.85185: stderr chunk (state=3): >>><<< 41016 1727204199.85194: stdout chunk (state=3): >>><<< 41016 1727204199.85224: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204199.85237: handler run complete 41016 1727204199.85282: Evaluated conditional (False): False 41016 1727204199.85285: attempt loop complete, returning result 41016 1727204199.85287: _execute() done 41016 1727204199.85380: dumping result to json 41016 1727204199.85383: done dumping result, returning 41016 1727204199.85388: done running TaskExecutor() for managed-node1/TASK: Get the interface1 MAC address [028d2410-947f-12d5-0ec4-000000000064] 41016 1727204199.85390: sending task result for task 028d2410-947f-12d5-0ec4-000000000064 41016 1727204199.85464: done sending task result for task 028d2410-947f-12d5-0ec4-000000000064 41016 1727204199.85467: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "cat", "/sys/class/net/ethtest1/address" ], "delta": "0:00:00.003235", "end": "2024-09-24 14:56:39.798670", "rc": 0, "start": "2024-09-24 14:56:39.795435" } STDOUT: ca:90:ed:ea:28:3e 41016 1727204199.85749: no more pending results, returning what we have 41016 1727204199.85753: results queue empty 41016 1727204199.85754: checking for any_errors_fatal 41016 1727204199.85761: done checking for any_errors_fatal 41016 1727204199.85762: checking for max_fail_percentage 41016 1727204199.85763: done checking for max_fail_percentage 41016 1727204199.85764: checking to see if all hosts have failed and the running result is not ok 41016 1727204199.85765: done checking to see if all hosts have failed 41016 1727204199.85766: getting the remaining hosts for this loop 41016 1727204199.85767: done getting the remaining hosts for this loop 41016 1727204199.85770: getting the next task for host managed-node1 41016 1727204199.85784: done getting next task for host managed-node1 41016 1727204199.85791: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41016 1727204199.85793: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204199.85814: getting variables 41016 1727204199.85816: in VariableManager get_vars() 41016 1727204199.85856: Calling all_inventory to load vars for managed-node1 41016 1727204199.85858: Calling groups_inventory to load vars for managed-node1 41016 1727204199.85861: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204199.85871: Calling all_plugins_play to load vars for managed-node1 41016 1727204199.85874: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204199.86090: Calling groups_plugins_play to load vars for managed-node1 41016 1727204199.87787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204199.89301: done with get_vars() 41016 1727204199.89329: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:56:39 -0400 (0:00:00.401) 0:00:23.570 ***** 41016 1727204199.89435: entering _queue_task() for managed-node1/include_tasks 41016 1727204199.89912: worker is 1 (out of 1 available) 41016 1727204199.89924: exiting _queue_task() for managed-node1/include_tasks 41016 1727204199.89935: done queuing things up, now waiting for results queue to drain 41016 1727204199.89936: waiting for pending results... 41016 1727204199.90289: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41016 1727204199.90331: in run() - task 028d2410-947f-12d5-0ec4-00000000006c 41016 1727204199.90355: variable 'ansible_search_path' from source: unknown 41016 1727204199.90365: variable 'ansible_search_path' from source: unknown 41016 1727204199.90422: calling self._execute() 41016 1727204199.90540: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204199.90557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204199.90572: variable 'omit' from source: magic vars 41016 1727204199.91018: variable 'ansible_distribution_major_version' from source: facts 41016 1727204199.91045: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204199.91150: _execute() done 41016 1727204199.91153: dumping result to json 41016 1727204199.91156: done dumping result, returning 41016 1727204199.91159: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-12d5-0ec4-00000000006c] 41016 1727204199.91161: sending task result for task 028d2410-947f-12d5-0ec4-00000000006c 41016 1727204199.91368: done sending task result for task 028d2410-947f-12d5-0ec4-00000000006c 41016 1727204199.91371: WORKER PROCESS EXITING 41016 1727204199.91421: no more pending results, returning what we have 41016 1727204199.91427: in VariableManager get_vars() 41016 1727204199.91474: Calling all_inventory to load vars for managed-node1 41016 1727204199.91479: Calling groups_inventory to load vars for managed-node1 41016 1727204199.91482: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204199.91495: Calling all_plugins_play to load vars for managed-node1 41016 1727204199.91498: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204199.91501: Calling groups_plugins_play to load vars for managed-node1 41016 1727204199.93072: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204199.94853: done with get_vars() 41016 1727204199.94871: variable 'ansible_search_path' from source: unknown 41016 1727204199.94873: variable 'ansible_search_path' from source: unknown 41016 1727204199.94926: we have included files to process 41016 1727204199.94928: generating all_blocks data 41016 1727204199.94931: done generating all_blocks data 41016 1727204199.94936: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41016 1727204199.94937: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41016 1727204199.94939: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41016 1727204199.95570: done processing included file 41016 1727204199.95572: iterating over new_blocks loaded from include file 41016 1727204199.95573: in VariableManager get_vars() 41016 1727204199.95600: done with get_vars() 41016 1727204199.95602: filtering new block on tags 41016 1727204199.95624: done filtering new block on tags 41016 1727204199.95627: in VariableManager get_vars() 41016 1727204199.95652: done with get_vars() 41016 1727204199.95653: filtering new block on tags 41016 1727204199.95680: done filtering new block on tags 41016 1727204199.95683: in VariableManager get_vars() 41016 1727204199.95706: done with get_vars() 41016 1727204199.95708: filtering new block on tags 41016 1727204199.95729: done filtering new block on tags 41016 1727204199.95732: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 41016 1727204199.95737: extending task lists for all hosts with included blocks 41016 1727204199.96622: done extending task lists 41016 1727204199.96623: done processing included files 41016 1727204199.96624: results queue empty 41016 1727204199.96625: checking for any_errors_fatal 41016 1727204199.96630: done checking for any_errors_fatal 41016 1727204199.96631: checking for max_fail_percentage 41016 1727204199.96632: done checking for max_fail_percentage 41016 1727204199.96632: checking to see if all hosts have failed and the running result is not ok 41016 1727204199.96633: done checking to see if all hosts have failed 41016 1727204199.96634: getting the remaining hosts for this loop 41016 1727204199.96635: done getting the remaining hosts for this loop 41016 1727204199.96637: getting the next task for host managed-node1 41016 1727204199.96641: done getting next task for host managed-node1 41016 1727204199.96643: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41016 1727204199.96650: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204199.96659: getting variables 41016 1727204199.96660: in VariableManager get_vars() 41016 1727204199.96674: Calling all_inventory to load vars for managed-node1 41016 1727204199.96678: Calling groups_inventory to load vars for managed-node1 41016 1727204199.96680: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204199.96685: Calling all_plugins_play to load vars for managed-node1 41016 1727204199.96687: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204199.96689: Calling groups_plugins_play to load vars for managed-node1 41016 1727204199.97889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204199.99468: done with get_vars() 41016 1727204199.99492: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:56:39 -0400 (0:00:00.101) 0:00:23.671 ***** 41016 1727204199.99570: entering _queue_task() for managed-node1/setup 41016 1727204200.00024: worker is 1 (out of 1 available) 41016 1727204200.00036: exiting _queue_task() for managed-node1/setup 41016 1727204200.00045: done queuing things up, now waiting for results queue to drain 41016 1727204200.00047: waiting for pending results... 41016 1727204200.00359: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41016 1727204200.00455: in run() - task 028d2410-947f-12d5-0ec4-000000000563 41016 1727204200.00474: variable 'ansible_search_path' from source: unknown 41016 1727204200.00485: variable 'ansible_search_path' from source: unknown 41016 1727204200.00527: calling self._execute() 41016 1727204200.00637: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204200.00648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204200.00662: variable 'omit' from source: magic vars 41016 1727204200.01057: variable 'ansible_distribution_major_version' from source: facts 41016 1727204200.01072: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204200.01330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204200.03724: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204200.03794: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204200.03839: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204200.03880: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204200.03982: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204200.04012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
41016 1727204200.04049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204200.04084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204200.04141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204200.04161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204200.04282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204200.04285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204200.04288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204200.04341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204200.04361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204200.04549: variable '__network_required_facts' from source: role '' defaults 41016 1727204200.04564: variable 'ansible_facts' from source: unknown 41016 1727204200.05331: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 41016 1727204200.05340: when evaluation is False, skipping this task 41016 1727204200.05348: _execute() done 41016 1727204200.05408: dumping result to json 41016 1727204200.05414: done dumping result, returning 41016 1727204200.05417: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-12d5-0ec4-000000000563] 41016 1727204200.05420: sending task result for task 028d2410-947f-12d5-0ec4-000000000563 41016 1727204200.05493: done sending task result for task 028d2410-947f-12d5-0ec4-000000000563 41016 1727204200.05497: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41016 1727204200.05549: no more pending results, returning what we have 41016 1727204200.05553: results queue empty 41016 1727204200.05554: checking for any_errors_fatal 41016 1727204200.05556: done checking for any_errors_fatal 41016 1727204200.05556: checking for max_fail_percentage 41016 1727204200.05558: done checking for max_fail_percentage 41016 1727204200.05559: checking to see if all hosts have failed and the running 
result is not ok 41016 1727204200.05560: done checking to see if all hosts have failed 41016 1727204200.05561: getting the remaining hosts for this loop 41016 1727204200.05562: done getting the remaining hosts for this loop 41016 1727204200.05566: getting the next task for host managed-node1 41016 1727204200.05578: done getting next task for host managed-node1 41016 1727204200.05583: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 41016 1727204200.05587: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204200.05606: getting variables 41016 1727204200.05608: in VariableManager get_vars() 41016 1727204200.05657: Calling all_inventory to load vars for managed-node1 41016 1727204200.05660: Calling groups_inventory to load vars for managed-node1 41016 1727204200.05663: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204200.05673: Calling all_plugins_play to load vars for managed-node1 41016 1727204200.05895: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204200.05899: Calling groups_plugins_play to load vars for managed-node1 41016 1727204200.07525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204200.09112: done with get_vars() 41016 1727204200.09135: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:56:40 -0400 (0:00:00.096) 0:00:23.768 ***** 41016 1727204200.09237: entering _queue_task() for managed-node1/stat 41016 1727204200.09573: worker is 1 (out of 1 available) 41016 1727204200.09699: exiting _queue_task() for managed-node1/stat 41016 1727204200.09712: done queuing things up, now waiting for results queue to drain 41016 1727204200.09713: waiting for pending results... 
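The "Ensure ansible_facts used by role are present" task above is the first guard in roles/network/tasks/set_facts.yml and was skipped because every fact the role needs is already gathered. A minimal sketch, assuming the structure below, follows from the evaluated conditional and the censored (no_log) skip output; the setup module's arguments are not visible in the log and are therefore omitted.

- name: Ensure ansible_facts used by role are present
  setup:   # module arguments are not shown in this log
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true   # implied by the "censored ... 'no_log: true'" skip message above
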
41016 1727204200.09998: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 41016 1727204200.10182: in run() - task 028d2410-947f-12d5-0ec4-000000000565 41016 1727204200.10186: variable 'ansible_search_path' from source: unknown 41016 1727204200.10188: variable 'ansible_search_path' from source: unknown 41016 1727204200.10191: calling self._execute() 41016 1727204200.10249: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204200.10260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204200.10272: variable 'omit' from source: magic vars 41016 1727204200.10663: variable 'ansible_distribution_major_version' from source: facts 41016 1727204200.10681: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204200.10852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204200.11132: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204200.11186: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204200.11225: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204200.11262: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204200.11355: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204200.11386: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204200.11426: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204200.11512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204200.11555: variable '__network_is_ostree' from source: set_fact 41016 1727204200.11566: Evaluated conditional (not __network_is_ostree is defined): False 41016 1727204200.11573: when evaluation is False, skipping this task 41016 1727204200.11582: _execute() done 41016 1727204200.11588: dumping result to json 41016 1727204200.11596: done dumping result, returning 41016 1727204200.11605: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-12d5-0ec4-000000000565] 41016 1727204200.11623: sending task result for task 028d2410-947f-12d5-0ec4-000000000565 skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41016 1727204200.11883: no more pending results, returning what we have 41016 1727204200.11888: results queue empty 41016 1727204200.11889: checking for any_errors_fatal 41016 1727204200.11899: done checking for any_errors_fatal 41016 1727204200.11900: checking for max_fail_percentage 41016 1727204200.11901: done checking for max_fail_percentage 41016 1727204200.11903: checking to see if all hosts have 
failed and the running result is not ok 41016 1727204200.11904: done checking to see if all hosts have failed 41016 1727204200.11905: getting the remaining hosts for this loop 41016 1727204200.11906: done getting the remaining hosts for this loop 41016 1727204200.11913: getting the next task for host managed-node1 41016 1727204200.11921: done getting next task for host managed-node1 41016 1727204200.11925: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41016 1727204200.11929: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204200.11950: getting variables 41016 1727204200.11951: in VariableManager get_vars() 41016 1727204200.11994: Calling all_inventory to load vars for managed-node1 41016 1727204200.11997: Calling groups_inventory to load vars for managed-node1 41016 1727204200.12000: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204200.12013: Calling all_plugins_play to load vars for managed-node1 41016 1727204200.12016: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204200.12022: Calling groups_plugins_play to load vars for managed-node1 41016 1727204200.12603: done sending task result for task 028d2410-947f-12d5-0ec4-000000000565 41016 1727204200.12606: WORKER PROCESS EXITING 41016 1727204200.13616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204200.15290: done with get_vars() 41016 1727204200.15316: done getting variables 41016 1727204200.15378: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:56:40 -0400 (0:00:00.061) 0:00:23.830 ***** 41016 1727204200.15415: entering _queue_task() for managed-node1/set_fact 41016 1727204200.15735: worker is 1 (out of 1 available) 41016 1727204200.15747: exiting _queue_task() for managed-node1/set_fact 41016 1727204200.15757: done queuing things up, now waiting for results queue to drain 41016 1727204200.15758: waiting for pending results... 
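
The "Check if system is ostree" task above is queued as managed-node1/stat and skipped because its false_condition, not __network_is_ostree is defined, did not hold: the fact was already set earlier in the run. A sketch of the usual pattern, assuming the conventional /run/ostree-booted marker path and a hypothetical register name (neither appears in this log):

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted          # assumed marker path; not shown in the log
      register: __ostree_booted_stat      # hypothetical register name
      when: not __network_is_ostree is defined
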
41016 1727204200.16051: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41016 1727204200.16228: in run() - task 028d2410-947f-12d5-0ec4-000000000566 41016 1727204200.16246: variable 'ansible_search_path' from source: unknown 41016 1727204200.16253: variable 'ansible_search_path' from source: unknown 41016 1727204200.16292: calling self._execute() 41016 1727204200.16397: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204200.16416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204200.16436: variable 'omit' from source: magic vars 41016 1727204200.16808: variable 'ansible_distribution_major_version' from source: facts 41016 1727204200.16830: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204200.17031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204200.17339: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204200.17389: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204200.17437: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204200.17480: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204200.17621: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204200.17624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204200.17651: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204200.17685: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204200.17792: variable '__network_is_ostree' from source: set_fact 41016 1727204200.17805: Evaluated conditional (not __network_is_ostree is defined): False 41016 1727204200.17817: when evaluation is False, skipping this task 41016 1727204200.17838: _execute() done 41016 1727204200.17841: dumping result to json 41016 1727204200.17948: done dumping result, returning 41016 1727204200.17952: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-12d5-0ec4-000000000566] 41016 1727204200.17955: sending task result for task 028d2410-947f-12d5-0ec4-000000000566 41016 1727204200.18029: done sending task result for task 028d2410-947f-12d5-0ec4-000000000566 41016 1727204200.18032: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41016 1727204200.18104: no more pending results, returning what we have 41016 1727204200.18112: results queue empty 41016 1727204200.18114: checking for any_errors_fatal 41016 1727204200.18124: done checking for any_errors_fatal 41016 
1727204200.18125: checking for max_fail_percentage 41016 1727204200.18127: done checking for max_fail_percentage 41016 1727204200.18128: checking to see if all hosts have failed and the running result is not ok 41016 1727204200.18129: done checking to see if all hosts have failed 41016 1727204200.18130: getting the remaining hosts for this loop 41016 1727204200.18131: done getting the remaining hosts for this loop 41016 1727204200.18135: getting the next task for host managed-node1 41016 1727204200.18145: done getting next task for host managed-node1 41016 1727204200.18150: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 41016 1727204200.18155: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204200.18183: getting variables 41016 1727204200.18185: in VariableManager get_vars() 41016 1727204200.18234: Calling all_inventory to load vars for managed-node1 41016 1727204200.18237: Calling groups_inventory to load vars for managed-node1 41016 1727204200.18240: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204200.18251: Calling all_plugins_play to load vars for managed-node1 41016 1727204200.18254: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204200.18257: Calling groups_plugins_play to load vars for managed-node1 41016 1727204200.20040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204200.21647: done with get_vars() 41016 1727204200.21669: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:56:40 -0400 (0:00:00.063) 0:00:23.893 ***** 41016 1727204200.21774: entering _queue_task() for managed-node1/service_facts 41016 1727204200.22128: worker is 1 (out of 1 available) 41016 1727204200.22140: exiting _queue_task() for managed-node1/service_facts 41016 1727204200.22152: done queuing things up, now waiting for results queue to drain 41016 1727204200.22154: waiting for pending results... 
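
The skipped set_fact above is the second half of the same pattern: it would normally cache the stat result as __network_is_ostree so that both tasks are skipped on later passes, which is exactly what happens here. A sketch under the same assumptions, reusing the hypothetical register name from the previous example:

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # expression is illustrative
      when: not __network_is_ostree is defined
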
41016 1727204200.22461: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 41016 1727204200.22716: in run() - task 028d2410-947f-12d5-0ec4-000000000568 41016 1727204200.22720: variable 'ansible_search_path' from source: unknown 41016 1727204200.22722: variable 'ansible_search_path' from source: unknown 41016 1727204200.22724: calling self._execute() 41016 1727204200.22804: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204200.22825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204200.22839: variable 'omit' from source: magic vars 41016 1727204200.23235: variable 'ansible_distribution_major_version' from source: facts 41016 1727204200.23257: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204200.23269: variable 'omit' from source: magic vars 41016 1727204200.23349: variable 'omit' from source: magic vars 41016 1727204200.23396: variable 'omit' from source: magic vars 41016 1727204200.23440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204200.23583: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204200.23587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204200.23589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204200.23591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204200.23593: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204200.23595: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204200.23597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204200.23704: Set connection var ansible_shell_executable to /bin/sh 41016 1727204200.23718: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204200.23729: Set connection var ansible_shell_type to sh 41016 1727204200.23738: Set connection var ansible_timeout to 10 41016 1727204200.23748: Set connection var ansible_pipelining to False 41016 1727204200.23759: Set connection var ansible_connection to ssh 41016 1727204200.23785: variable 'ansible_shell_executable' from source: unknown 41016 1727204200.23796: variable 'ansible_connection' from source: unknown 41016 1727204200.23804: variable 'ansible_module_compression' from source: unknown 41016 1727204200.23811: variable 'ansible_shell_type' from source: unknown 41016 1727204200.23912: variable 'ansible_shell_executable' from source: unknown 41016 1727204200.23915: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204200.23917: variable 'ansible_pipelining' from source: unknown 41016 1727204200.23919: variable 'ansible_timeout' from source: unknown 41016 1727204200.23921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204200.24038: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204200.24053: variable 'omit' from source: magic vars 41016 
1727204200.24062: starting attempt loop 41016 1727204200.24069: running the handler 41016 1727204200.24086: _low_level_execute_command(): starting 41016 1727204200.24095: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204200.24901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204200.24947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204200.24973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204200.25092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204200.26913: stdout chunk (state=3): >>>/root <<< 41016 1727204200.27084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204200.27088: stdout chunk (state=3): >>><<< 41016 1727204200.27091: stderr chunk (state=3): >>><<< 41016 1727204200.27115: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204200.27225: _low_level_execute_command(): starting 41016 1727204200.27230: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204200.2712388-42848-235902956573838 `" && echo ansible-tmp-1727204200.2712388-42848-235902956573838="` echo /root/.ansible/tmp/ansible-tmp-1727204200.2712388-42848-235902956573838 `" ) && sleep 0' 
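
From this point the log shows transport plumbing (SSH ControlMaster reuse, remote home lookup, temp directory creation, module transfer) rather than task logic; the task itself is a plain service_facts call whose only logged condition, ansible_distribution_major_version != '6', was evaluated True above. A minimal sketch, plus an illustrative follow-up showing how the resulting ansible_facts.services dict (dumped later in this log) is typically consumed; neither snippet is taken from the role itself:

    - name: Check which services are running
      ansible.builtin.service_facts:
      when: ansible_distribution_major_version != '6'   # as evaluated in the log; may be applied higher up in the real role

    - name: Use the gathered services dict (illustrative)
      ansible.builtin.debug:
        msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"
      when: "'NetworkManager.service' in ansible_facts.services"
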
41016 1727204200.27830: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204200.27844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204200.27860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204200.27881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204200.27901: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204200.27995: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204200.28000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204200.28050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204200.28067: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204200.28101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204200.28218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204200.30330: stdout chunk (state=3): >>>ansible-tmp-1727204200.2712388-42848-235902956573838=/root/.ansible/tmp/ansible-tmp-1727204200.2712388-42848-235902956573838 <<< 41016 1727204200.30512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204200.30515: stdout chunk (state=3): >>><<< 41016 1727204200.30518: stderr chunk (state=3): >>><<< 41016 1727204200.30534: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204200.2712388-42848-235902956573838=/root/.ansible/tmp/ansible-tmp-1727204200.2712388-42848-235902956573838 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204200.30596: variable 'ansible_module_compression' from 
source: unknown 41016 1727204200.30680: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 41016 1727204200.30704: variable 'ansible_facts' from source: unknown 41016 1727204200.30816: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204200.2712388-42848-235902956573838/AnsiballZ_service_facts.py 41016 1727204200.31192: Sending initial data 41016 1727204200.31195: Sent initial data (162 bytes) 41016 1727204200.32038: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204200.32087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204200.32132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204200.32211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204200.33972: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204200.34071: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204200.34162: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpnasqv8h9 /root/.ansible/tmp/ansible-tmp-1727204200.2712388-42848-235902956573838/AnsiballZ_service_facts.py <<< 41016 1727204200.34166: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204200.2712388-42848-235902956573838/AnsiballZ_service_facts.py" <<< 41016 1727204200.34250: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpnasqv8h9" to remote "/root/.ansible/tmp/ansible-tmp-1727204200.2712388-42848-235902956573838/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204200.2712388-42848-235902956573838/AnsiballZ_service_facts.py" <<< 41016 1727204200.35238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204200.35278: stderr chunk (state=3): >>><<< 41016 1727204200.35288: stdout chunk (state=3): >>><<< 41016 1727204200.35401: done transferring module to remote 41016 1727204200.35405: _low_level_execute_command(): starting 41016 1727204200.35407: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204200.2712388-42848-235902956573838/ /root/.ansible/tmp/ansible-tmp-1727204200.2712388-42848-235902956573838/AnsiballZ_service_facts.py && sleep 0' 41016 1727204200.35991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204200.36034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204200.36051: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204200.36090: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204200.36153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204200.36179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204200.36210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204200.36297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204200.38372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204200.38385: stdout chunk (state=3): >>><<< 41016 1727204200.38388: stderr chunk (state=3): >>><<< 41016 1727204200.38406: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204200.38496: _low_level_execute_command(): starting 41016 1727204200.38500: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204200.2712388-42848-235902956573838/AnsiballZ_service_facts.py && sleep 0' 41016 1727204200.39408: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204200.39423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204200.39435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204200.39468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204200.39487: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204200.39581: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204200.39604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204200.39725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204202.14925: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 41016 1727204202.14961: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": 
"rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 41016 1727204202.14981: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": 
"dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 41016 1727204202.14987: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 41016 1727204202.16895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204202.16899: stdout chunk (state=3): >>><<< 41016 1727204202.16901: stderr chunk (state=3): >>><<< 41016 1727204202.16905: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": 
"NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": 
"fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 41016 1727204202.17832: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204200.2712388-42848-235902956573838/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204202.17922: _low_level_execute_command(): starting 41016 1727204202.17926: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204200.2712388-42848-235902956573838/ > /dev/null 2>&1 && sleep 0' 41016 1727204202.18444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204202.18458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204202.18472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204202.18492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204202.18597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204202.18622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204202.18726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204202.20784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204202.20801: stdout chunk (state=3): >>><<< 41016 1727204202.20814: stderr chunk (state=3): >>><<< 41016 1727204202.20834: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204202.20846: handler run complete 41016 1727204202.21057: variable 'ansible_facts' from source: unknown 41016 1727204202.21230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204202.21751: variable 'ansible_facts' from source: unknown 41016 1727204202.21900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204202.22127: attempt loop complete, returning result 41016 1727204202.22136: _execute() done 41016 1727204202.22142: dumping result to json 41016 1727204202.22216: done dumping result, returning 41016 1727204202.22227: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-12d5-0ec4-000000000568] 41016 1727204202.22234: sending task result for task 028d2410-947f-12d5-0ec4-000000000568 41016 1727204202.23651: done sending task result for task 028d2410-947f-12d5-0ec4-000000000568 41016 1727204202.23654: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41016 1727204202.23789: no more pending results, returning what we have 41016 1727204202.23793: results queue empty 41016 1727204202.23794: checking for any_errors_fatal 41016 1727204202.23798: done checking for any_errors_fatal 41016 1727204202.23798: checking for max_fail_percentage 41016 1727204202.23800: done checking for max_fail_percentage 41016 1727204202.23801: checking to see if all hosts have failed and the running result is not ok 41016 1727204202.23802: done checking to see if all hosts have failed 41016 1727204202.23803: getting the remaining hosts for this loop 41016 1727204202.23804: done getting the remaining hosts for this loop 41016 1727204202.23807: getting the next task for host managed-node1 41016 1727204202.23813: done getting next task for host managed-node1 41016 1727204202.23816: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 41016 1727204202.23825: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204202.23836: getting variables 41016 1727204202.23838: in VariableManager get_vars() 41016 1727204202.23879: Calling all_inventory to load vars for managed-node1 41016 1727204202.23882: Calling groups_inventory to load vars for managed-node1 41016 1727204202.23885: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204202.23895: Calling all_plugins_play to load vars for managed-node1 41016 1727204202.23898: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204202.23901: Calling groups_plugins_play to load vars for managed-node1 41016 1727204202.25236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204202.26715: done with get_vars() 41016 1727204202.26736: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:56:42 -0400 (0:00:02.050) 0:00:25.944 ***** 41016 1727204202.26812: entering _queue_task() for managed-node1/package_facts 41016 1727204202.27073: worker is 1 (out of 1 available) 41016 1727204202.27089: exiting _queue_task() for managed-node1/package_facts 41016 1727204202.27100: done queuing things up, now waiting for results queue to drain 41016 1727204202.27102: waiting for pending results... 
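For reference, the service_facts task that just completed (its result is censored above because 'no_log: true' was set) returns, as visible in the raw stdout earlier in this log, a dictionary under ansible_facts.services keyed by unit name, where each entry carries "name", "state", "status" and "source" fields. The following is only an illustrative sketch of how that captured payload could be filtered offline; the helper name, the stdin-based invocation and the saved-output filename are assumptions for the example, not part of this run:

    #!/usr/bin/env python3
    # Hypothetical post-processing helper (not part of the playbook run logged
    # here): filter the service_facts JSON payload shown above and print the
    # units the module reported with state == "running". Field names follow
    # the structure captured in this log:
    #   {"ansible_facts": {"services": {"<unit>": {"name": ..., "state": ...,
    #                                              "status": ..., "source": "systemd"}}}}
    import json
    import sys


    def running_services(module_output: dict) -> list:
        """Return the sorted unit names whose reported state is 'running'."""
        services = module_output.get("ansible_facts", {}).get("services", {})
        return sorted(name for name, svc in services.items()
                      if svc.get("state") == "running")


    if __name__ == "__main__":
        # Assumed usage: feed one captured module stdout document on stdin, e.g.
        #   python3 filter_services.py < service_facts_output.json
        data = json.load(sys.stdin)
        for unit in running_services(data):
            print(unit)

Against the output captured above, such a filter would report units like sshd.service, NetworkManager.service and chronyd.service, which is the same information the role's later conditionals consume from ansible_facts.services.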
41016 1727204202.27290: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 41016 1727204202.27391: in run() - task 028d2410-947f-12d5-0ec4-000000000569 41016 1727204202.27403: variable 'ansible_search_path' from source: unknown 41016 1727204202.27407: variable 'ansible_search_path' from source: unknown 41016 1727204202.27443: calling self._execute() 41016 1727204202.27513: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204202.27516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204202.27523: variable 'omit' from source: magic vars 41016 1727204202.27804: variable 'ansible_distribution_major_version' from source: facts 41016 1727204202.27815: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204202.27819: variable 'omit' from source: magic vars 41016 1727204202.27870: variable 'omit' from source: magic vars 41016 1727204202.27899: variable 'omit' from source: magic vars 41016 1727204202.27931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204202.27958: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204202.27973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204202.27994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204202.28001: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204202.28024: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204202.28028: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204202.28031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204202.28103: Set connection var ansible_shell_executable to /bin/sh 41016 1727204202.28106: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204202.28112: Set connection var ansible_shell_type to sh 41016 1727204202.28115: Set connection var ansible_timeout to 10 41016 1727204202.28117: Set connection var ansible_pipelining to False 41016 1727204202.28123: Set connection var ansible_connection to ssh 41016 1727204202.28139: variable 'ansible_shell_executable' from source: unknown 41016 1727204202.28141: variable 'ansible_connection' from source: unknown 41016 1727204202.28144: variable 'ansible_module_compression' from source: unknown 41016 1727204202.28147: variable 'ansible_shell_type' from source: unknown 41016 1727204202.28149: variable 'ansible_shell_executable' from source: unknown 41016 1727204202.28151: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204202.28155: variable 'ansible_pipelining' from source: unknown 41016 1727204202.28157: variable 'ansible_timeout' from source: unknown 41016 1727204202.28162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204202.28347: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204202.28514: variable 'omit' from source: magic vars 41016 
1727204202.28517: starting attempt loop 41016 1727204202.28519: running the handler 41016 1727204202.28522: _low_level_execute_command(): starting 41016 1727204202.28524: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204202.29124: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204202.29153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204202.29268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204202.31037: stdout chunk (state=3): >>>/root <<< 41016 1727204202.31168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204202.31171: stdout chunk (state=3): >>><<< 41016 1727204202.31174: stderr chunk (state=3): >>><<< 41016 1727204202.31189: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204202.31200: _low_level_execute_command(): starting 41016 1727204202.31206: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204202.3118896-42909-27015740680896 `" && echo ansible-tmp-1727204202.3118896-42909-27015740680896="` echo /root/.ansible/tmp/ansible-tmp-1727204202.3118896-42909-27015740680896 `" ) && sleep 0' 41016 1727204202.31818: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204202.31918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204202.31923: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204202.31936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204202.31952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204202.32038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204202.34103: stdout chunk (state=3): >>>ansible-tmp-1727204202.3118896-42909-27015740680896=/root/.ansible/tmp/ansible-tmp-1727204202.3118896-42909-27015740680896 <<< 41016 1727204202.34241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204202.34250: stderr chunk (state=3): >>><<< 41016 1727204202.34254: stdout chunk (state=3): >>><<< 41016 1727204202.34322: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204202.3118896-42909-27015740680896=/root/.ansible/tmp/ansible-tmp-1727204202.3118896-42909-27015740680896 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204202.34328: variable 'ansible_module_compression' from source: unknown 41016 1727204202.34407: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 41016 1727204202.34439: variable 'ansible_facts' from source: unknown 41016 1727204202.34685: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204202.3118896-42909-27015740680896/AnsiballZ_package_facts.py 41016 1727204202.34824: Sending initial data 41016 1727204202.34990: Sent initial data (161 bytes) 41016 1727204202.35349: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204202.35361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204202.35374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204202.35397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204202.35414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204202.35424: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204202.35440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204202.35462: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204202.35496: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204202.35581: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204202.35594: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204202.35685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204202.37424: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204202.37501: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204202.37572: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpu45q9ck2 /root/.ansible/tmp/ansible-tmp-1727204202.3118896-42909-27015740680896/AnsiballZ_package_facts.py <<< 41016 1727204202.37582: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204202.3118896-42909-27015740680896/AnsiballZ_package_facts.py" <<< 41016 1727204202.37644: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpu45q9ck2" to remote "/root/.ansible/tmp/ansible-tmp-1727204202.3118896-42909-27015740680896/AnsiballZ_package_facts.py" <<< 41016 1727204202.37649: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204202.3118896-42909-27015740680896/AnsiballZ_package_facts.py" <<< 41016 1727204202.39014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204202.39017: stderr chunk (state=3): >>><<< 41016 1727204202.39020: stdout chunk (state=3): >>><<< 41016 1727204202.39022: done transferring module to remote 41016 1727204202.39087: _low_level_execute_command(): starting 41016 1727204202.39090: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204202.3118896-42909-27015740680896/ /root/.ansible/tmp/ansible-tmp-1727204202.3118896-42909-27015740680896/AnsiballZ_package_facts.py && sleep 0' 41016 1727204202.39537: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204202.39585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204202.39589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204202.39680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204202.41781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204202.41785: stdout chunk (state=3): >>><<< 41016 1727204202.41788: stderr chunk (state=3): >>><<< 41016 1727204202.41790: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204202.41793: _low_level_execute_command(): starting 41016 1727204202.41795: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204202.3118896-42909-27015740680896/AnsiballZ_package_facts.py && sleep 0' 41016 1727204202.42293: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204202.42318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204202.42324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204202.42381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204202.42389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204202.42471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204202.89534: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 41016 1727204202.89553: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": 
[{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": 
"7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": 
[{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 41016 1727204202.89584: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", 
"version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 41016 1727204202.89624: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": 
"1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "<<< 41016 1727204202.89634: stdout chunk (state=3): 
>>>x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": 
"102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [<<< 41016 1727204202.89708: stdout chunk (state=3): >>>{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": 
"xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", 
"version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", 
"release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch",<<< 41016 1727204202.89771: stdout chunk (state=3): >>> "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": 
"9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 41016 1727204202.91840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204202.91872: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204202.91878: stdout chunk (state=3): >>><<< 41016 1727204202.91880: stderr chunk (state=3): >>><<< 41016 1727204202.92092: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": 
[{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": 
"3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", 
"release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", 
"version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 41016 1727204202.94335: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204202.3118896-42909-27015740680896/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204202.94350: _low_level_execute_command(): starting 41016 1727204202.94358: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204202.3118896-42909-27015740680896/ > /dev/null 2>&1 && sleep 0' 41016 1727204202.94992: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204202.95007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204202.95057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204202.95070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204202.95154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204202.95168: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204202.95183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204202.95208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204202.95319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204202.97354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204202.97367: stdout chunk (state=3): >>><<< 41016 1727204202.97379: stderr chunk (state=3): >>><<< 41016 1727204202.97396: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204202.97405: handler run complete 41016 1727204202.98291: variable 'ansible_facts' from source: unknown 41016 1727204202.98771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204203.00883: variable 'ansible_facts' from source: unknown 41016 1727204203.01192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204203.01895: attempt loop complete, returning result 41016 1727204203.01914: _execute() done 41016 1727204203.01921: dumping result to json 41016 1727204203.02134: done dumping result, returning 41016 1727204203.02148: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-12d5-0ec4-000000000569] 41016 1727204203.02158: sending task result for task 028d2410-947f-12d5-0ec4-000000000569 41016 1727204203.04371: done sending task result for task 028d2410-947f-12d5-0ec4-000000000569 41016 1727204203.04375: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41016 1727204203.04539: no more pending results, returning what we have 41016 1727204203.04542: results queue empty 41016 1727204203.04543: checking for any_errors_fatal 41016 1727204203.04548: done checking for any_errors_fatal 41016 1727204203.04549: checking for max_fail_percentage 41016 1727204203.04551: done checking for max_fail_percentage 41016 1727204203.04551: checking to see if all hosts have failed and the running result is not ok 41016 1727204203.04552: done checking to see if all hosts have failed 41016 1727204203.04553: getting the remaining hosts for this loop 41016 1727204203.04554: done getting the remaining hosts for this loop 41016 1727204203.04557: getting the next task for host managed-node1 41016 1727204203.04564: done getting next task for host managed-node1 41016 1727204203.04568: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 41016 1727204203.04570: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204203.04582: getting variables 41016 1727204203.04584: in VariableManager get_vars() 41016 1727204203.04618: Calling all_inventory to load vars for managed-node1 41016 1727204203.04621: Calling groups_inventory to load vars for managed-node1 41016 1727204203.04623: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204203.04632: Calling all_plugins_play to load vars for managed-node1 41016 1727204203.04634: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204203.04637: Calling groups_plugins_play to load vars for managed-node1 41016 1727204203.05876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204203.08586: done with get_vars() 41016 1727204203.08625: done getting variables 41016 1727204203.08689: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:56:43 -0400 (0:00:00.819) 0:00:26.763 ***** 41016 1727204203.08733: entering _queue_task() for managed-node1/debug 41016 1727204203.09103: worker is 1 (out of 1 available) 41016 1727204203.09119: exiting _queue_task() for managed-node1/debug 41016 1727204203.09132: done queuing things up, now waiting for results queue to drain 41016 1727204203.09134: waiting for pending results... 
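Editor's note on the preceding output: the long JSON blob above is the ansible_facts.packages dictionary produced by the package_facts module. Each key is an installed package name and each value is a list of {name, version, release, epoch, arch, source} entries (a list because several versions or architectures of one package can coexist). The printed task result is reduced to a "censored" stub because the task ran with no_log: true. The sketch below is purely illustrative and not part of the captured run (the play targeting and the package being looked up are only examples); it shows how facts of this shape are typically consumed:

# --- illustrative sketch, not part of the captured run ---
- hosts: managed-node1
  tasks:
    - name: Gather the installed-package inventory (same module as the task above)
      ansible.builtin.package_facts:
        manager: auto

    - name: Show the NetworkManager-tui entry, if it is installed
      ansible.builtin.debug:
        var: ansible_facts.packages['NetworkManager-tui']
      when: "'NetworkManager-tui' in ansible_facts.packages"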
41016 1727204203.09606: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 41016 1727204203.09614: in run() - task 028d2410-947f-12d5-0ec4-00000000006d 41016 1727204203.09618: variable 'ansible_search_path' from source: unknown 41016 1727204203.09879: variable 'ansible_search_path' from source: unknown 41016 1727204203.10134: calling self._execute() 41016 1727204203.10484: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204203.10488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204203.10491: variable 'omit' from source: magic vars 41016 1727204203.11002: variable 'ansible_distribution_major_version' from source: facts 41016 1727204203.11021: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204203.11035: variable 'omit' from source: magic vars 41016 1727204203.11093: variable 'omit' from source: magic vars 41016 1727204203.11204: variable 'network_provider' from source: set_fact 41016 1727204203.11231: variable 'omit' from source: magic vars 41016 1727204203.11282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204203.11323: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204203.11348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204203.11373: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204203.11463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204203.11466: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204203.11469: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204203.11471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204203.11544: Set connection var ansible_shell_executable to /bin/sh 41016 1727204203.11554: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204203.11563: Set connection var ansible_shell_type to sh 41016 1727204203.11578: Set connection var ansible_timeout to 10 41016 1727204203.11589: Set connection var ansible_pipelining to False 41016 1727204203.11601: Set connection var ansible_connection to ssh 41016 1727204203.11632: variable 'ansible_shell_executable' from source: unknown 41016 1727204203.11641: variable 'ansible_connection' from source: unknown 41016 1727204203.11649: variable 'ansible_module_compression' from source: unknown 41016 1727204203.11656: variable 'ansible_shell_type' from source: unknown 41016 1727204203.11664: variable 'ansible_shell_executable' from source: unknown 41016 1727204203.11671: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204203.11684: variable 'ansible_pipelining' from source: unknown 41016 1727204203.11690: variable 'ansible_timeout' from source: unknown 41016 1727204203.11780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204203.11845: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 41016 1727204203.11862: variable 'omit' from source: magic vars 41016 1727204203.11871: starting attempt loop 41016 1727204203.11880: running the handler 41016 1727204203.11937: handler run complete 41016 1727204203.11956: attempt loop complete, returning result 41016 1727204203.11963: _execute() done 41016 1727204203.11970: dumping result to json 41016 1727204203.11981: done dumping result, returning 41016 1727204203.11992: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-12d5-0ec4-00000000006d] 41016 1727204203.12002: sending task result for task 028d2410-947f-12d5-0ec4-00000000006d ok: [managed-node1] => {} MSG: Using network provider: nm 41016 1727204203.12178: no more pending results, returning what we have 41016 1727204203.12183: results queue empty 41016 1727204203.12184: checking for any_errors_fatal 41016 1727204203.12196: done checking for any_errors_fatal 41016 1727204203.12197: checking for max_fail_percentage 41016 1727204203.12198: done checking for max_fail_percentage 41016 1727204203.12199: checking to see if all hosts have failed and the running result is not ok 41016 1727204203.12200: done checking to see if all hosts have failed 41016 1727204203.12201: getting the remaining hosts for this loop 41016 1727204203.12202: done getting the remaining hosts for this loop 41016 1727204203.12206: getting the next task for host managed-node1 41016 1727204203.12217: done getting next task for host managed-node1 41016 1727204203.12221: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41016 1727204203.12225: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204203.12237: getting variables 41016 1727204203.12239: in VariableManager get_vars() 41016 1727204203.12385: Calling all_inventory to load vars for managed-node1 41016 1727204203.12389: Calling groups_inventory to load vars for managed-node1 41016 1727204203.12392: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204203.12402: Calling all_plugins_play to load vars for managed-node1 41016 1727204203.12406: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204203.12412: Calling groups_plugins_play to load vars for managed-node1 41016 1727204203.13088: done sending task result for task 028d2410-947f-12d5-0ec4-00000000006d 41016 1727204203.13092: WORKER PROCESS EXITING 41016 1727204203.14033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204203.15681: done with get_vars() 41016 1727204203.15703: done getting variables 41016 1727204203.15761: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:56:43 -0400 (0:00:00.070) 0:00:26.834 ***** 41016 1727204203.15796: entering _queue_task() for managed-node1/fail 41016 1727204203.16107: worker is 1 (out of 1 available) 41016 1727204203.16123: exiting _queue_task() for managed-node1/fail 41016 1727204203.16135: done queuing things up, now waiting for results queue to drain 41016 1727204203.16136: waiting for pending results... 
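Editor's note on the preceding output: the result "ok: [managed-node1] => {} ... MSG: Using network provider: nm" comes from a debug action at roles/network/tasks/main.yml:7, and the network_provider value it prints was established earlier in the run via set_fact (hence "from source: set_fact"). The snippet below is a rough sketch of what such a task typically looks like; the role's actual task body may differ:

# --- illustrative sketch, not part of the captured run ---
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"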
41016 1727204203.16431: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41016 1727204203.16578: in run() - task 028d2410-947f-12d5-0ec4-00000000006e 41016 1727204203.16602: variable 'ansible_search_path' from source: unknown 41016 1727204203.16613: variable 'ansible_search_path' from source: unknown 41016 1727204203.16651: calling self._execute() 41016 1727204203.16754: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204203.16765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204203.16781: variable 'omit' from source: magic vars 41016 1727204203.17169: variable 'ansible_distribution_major_version' from source: facts 41016 1727204203.17190: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204203.17319: variable 'network_state' from source: role '' defaults 41016 1727204203.17335: Evaluated conditional (network_state != {}): False 41016 1727204203.17343: when evaluation is False, skipping this task 41016 1727204203.17350: _execute() done 41016 1727204203.17362: dumping result to json 41016 1727204203.17371: done dumping result, returning 41016 1727204203.17383: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-12d5-0ec4-00000000006e] 41016 1727204203.17393: sending task result for task 028d2410-947f-12d5-0ec4-00000000006e skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41016 1727204203.17544: no more pending results, returning what we have 41016 1727204203.17549: results queue empty 41016 1727204203.17551: checking for any_errors_fatal 41016 1727204203.17560: done checking for any_errors_fatal 41016 1727204203.17561: checking for max_fail_percentage 41016 1727204203.17563: done checking for max_fail_percentage 41016 1727204203.17564: checking to see if all hosts have failed and the running result is not ok 41016 1727204203.17565: done checking to see if all hosts have failed 41016 1727204203.17566: getting the remaining hosts for this loop 41016 1727204203.17567: done getting the remaining hosts for this loop 41016 1727204203.17571: getting the next task for host managed-node1 41016 1727204203.17581: done getting next task for host managed-node1 41016 1727204203.17585: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41016 1727204203.17589: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204203.17615: getting variables 41016 1727204203.17617: in VariableManager get_vars() 41016 1727204203.17662: Calling all_inventory to load vars for managed-node1 41016 1727204203.17665: Calling groups_inventory to load vars for managed-node1 41016 1727204203.17668: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204203.17978: Calling all_plugins_play to load vars for managed-node1 41016 1727204203.17982: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204203.17988: done sending task result for task 028d2410-947f-12d5-0ec4-00000000006e 41016 1727204203.17991: WORKER PROCESS EXITING 41016 1727204203.17994: Calling groups_plugins_play to load vars for managed-node1 41016 1727204203.20420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204203.22544: done with get_vars() 41016 1727204203.22579: done getting variables 41016 1727204203.22641: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:56:43 -0400 (0:00:00.068) 0:00:26.902 ***** 41016 1727204203.22677: entering _queue_task() for managed-node1/fail 41016 1727204203.23035: worker is 1 (out of 1 available) 41016 1727204203.23047: exiting _queue_task() for managed-node1/fail 41016 1727204203.23059: done queuing things up, now waiting for results queue to drain 41016 1727204203.23060: waiting for pending results... 
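Editor's note on the preceding output: the "Abort applying the network state configuration ... initscripts provider" task was skipped because its guard, network_state != {}, evaluated to False; this play never sets network_state, so the role default (an empty dict) applies. The sketch below reconstructs the guard pattern only from the conditions the log actually evaluated; the role's real task presumably also checks that the initscripts provider is selected, and its failure message will differ:

# --- illustrative sketch, not the role's literal source ---
- name: Abort if network_state is used with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}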
41016 1727204203.23362: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41016 1727204203.23504: in run() - task 028d2410-947f-12d5-0ec4-00000000006f 41016 1727204203.23526: variable 'ansible_search_path' from source: unknown 41016 1727204203.23537: variable 'ansible_search_path' from source: unknown 41016 1727204203.23582: calling self._execute() 41016 1727204203.23688: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204203.23701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204203.23881: variable 'omit' from source: magic vars 41016 1727204203.24134: variable 'ansible_distribution_major_version' from source: facts 41016 1727204203.24150: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204203.24282: variable 'network_state' from source: role '' defaults 41016 1727204203.24300: Evaluated conditional (network_state != {}): False 41016 1727204203.24313: when evaluation is False, skipping this task 41016 1727204203.24327: _execute() done 41016 1727204203.24335: dumping result to json 41016 1727204203.24344: done dumping result, returning 41016 1727204203.24356: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-12d5-0ec4-00000000006f] 41016 1727204203.24366: sending task result for task 028d2410-947f-12d5-0ec4-00000000006f 41016 1727204203.24581: done sending task result for task 028d2410-947f-12d5-0ec4-00000000006f 41016 1727204203.24585: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41016 1727204203.24640: no more pending results, returning what we have 41016 1727204203.24645: results queue empty 41016 1727204203.24646: checking for any_errors_fatal 41016 1727204203.24654: done checking for any_errors_fatal 41016 1727204203.24654: checking for max_fail_percentage 41016 1727204203.24657: done checking for max_fail_percentage 41016 1727204203.24658: checking to see if all hosts have failed and the running result is not ok 41016 1727204203.24659: done checking to see if all hosts have failed 41016 1727204203.24660: getting the remaining hosts for this loop 41016 1727204203.24661: done getting the remaining hosts for this loop 41016 1727204203.24665: getting the next task for host managed-node1 41016 1727204203.24673: done getting next task for host managed-node1 41016 1727204203.24680: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41016 1727204203.24684: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204203.24707: getting variables 41016 1727204203.24712: in VariableManager get_vars() 41016 1727204203.24755: Calling all_inventory to load vars for managed-node1 41016 1727204203.24758: Calling groups_inventory to load vars for managed-node1 41016 1727204203.24761: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204203.24772: Calling all_plugins_play to load vars for managed-node1 41016 1727204203.24978: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204203.24984: Calling groups_plugins_play to load vars for managed-node1 41016 1727204203.26441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204203.27952: done with get_vars() 41016 1727204203.27980: done getting variables 41016 1727204203.28042: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:56:43 -0400 (0:00:00.053) 0:00:26.956 ***** 41016 1727204203.28077: entering _queue_task() for managed-node1/fail 41016 1727204203.28397: worker is 1 (out of 1 available) 41016 1727204203.28414: exiting _queue_task() for managed-node1/fail 41016 1727204203.28425: done queuing things up, now waiting for results queue to drain 41016 1727204203.28427: waiting for pending results... 
41016 1727204203.28939: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41016 1727204203.29282: in run() - task 028d2410-947f-12d5-0ec4-000000000070 41016 1727204203.29480: variable 'ansible_search_path' from source: unknown 41016 1727204203.29484: variable 'ansible_search_path' from source: unknown 41016 1727204203.29488: calling self._execute() 41016 1727204203.29600: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204203.29690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204203.29704: variable 'omit' from source: magic vars 41016 1727204203.30457: variable 'ansible_distribution_major_version' from source: facts 41016 1727204203.30524: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204203.31051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204203.34421: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204203.34490: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204203.34540: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204203.34580: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204203.34616: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204203.34700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204203.34744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204203.34775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204203.34825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204203.34849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204203.34939: variable 'ansible_distribution_major_version' from source: facts 41016 1727204203.34962: Evaluated conditional (ansible_distribution_major_version | int > 9): True 41016 1727204203.35088: variable 'ansible_distribution' from source: facts 41016 1727204203.35099: variable '__network_rh_distros' from source: role '' defaults 41016 1727204203.35116: Evaluated conditional (ansible_distribution in __network_rh_distros): True 41016 1727204203.35513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204203.35680: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204203.35684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204203.35721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204203.35740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204203.35791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204203.36033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204203.36036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204203.36038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204203.36040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204203.36118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204203.36357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204203.36360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204203.36363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204203.36365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204203.37088: variable 'network_connections' from source: task vars 41016 1727204203.37106: variable 'interface1' from source: play vars 41016 1727204203.37286: variable 'interface1' from source: play vars 41016 1727204203.37360: variable 'interface1_mac' from source: set_fact 41016 1727204203.37461: variable 'network_state' from source: role '' defaults 41016 1727204203.37550: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204203.37879: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204203.38024: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204203.38126: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204203.38162: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204203.38312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204203.38351: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204203.38527: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204203.38531: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204203.38534: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 41016 1727204203.38537: when evaluation is False, skipping this task 41016 1727204203.38539: _execute() done 41016 1727204203.38542: dumping result to json 41016 1727204203.38643: done dumping result, returning 41016 1727204203.38657: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-12d5-0ec4-000000000070] 41016 1727204203.38668: sending task result for task 028d2410-947f-12d5-0ec4-000000000070 skipping: [managed-node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 41016 1727204203.38963: no more pending results, returning what we have 41016 1727204203.38968: results queue empty 41016 1727204203.38969: checking for any_errors_fatal 41016 1727204203.38979: done checking for any_errors_fatal 41016 1727204203.38980: checking for max_fail_percentage 41016 1727204203.38983: done checking for max_fail_percentage 41016 1727204203.38984: checking to see if all hosts have failed and the running result is not ok 41016 1727204203.38985: done checking to see if all hosts have failed 41016 1727204203.38985: getting the remaining hosts for this loop 41016 1727204203.38987: done getting the remaining hosts for this loop 41016 1727204203.38992: getting the next task for host managed-node1 41016 1727204203.39001: done getting next task for host managed-node1 41016 1727204203.39005: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41016 1727204203.39008: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204203.39032: getting variables 41016 1727204203.39034: in VariableManager get_vars() 41016 1727204203.39383: Calling all_inventory to load vars for managed-node1 41016 1727204203.39387: Calling groups_inventory to load vars for managed-node1 41016 1727204203.39390: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204203.39401: Calling all_plugins_play to load vars for managed-node1 41016 1727204203.39404: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204203.39407: Calling groups_plugins_play to load vars for managed-node1 41016 1727204203.40292: done sending task result for task 028d2410-947f-12d5-0ec4-000000000070 41016 1727204203.40295: WORKER PROCESS EXITING 41016 1727204203.42294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204203.45534: done with get_vars() 41016 1727204203.45567: done getting variables 41016 1727204203.45635: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:56:43 -0400 (0:00:00.175) 0:00:27.132 ***** 41016 1727204203.45670: entering _queue_task() for managed-node1/dnf 41016 1727204203.46418: worker is 1 (out of 1 available) 41016 1727204203.46433: exiting _queue_task() for managed-node1/dnf 41016 1727204203.46446: done queuing things up, now waiting for results queue to drain 41016 1727204203.46447: waiting for pending results... 
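For orientation, here is a rough sketch of the kind of guard task that was just skipped, reconstructed only from the task name and the false_condition reported in its result. The team-detection expression is quoted from the log; the fail message and the explicit EL10 version check are assumptions, not the role's actual source:

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming configuration is not available on EL10 or later   # message text is assumed
  when:
    - ansible_distribution_major_version | int >= 10   # assumed from the task name, not shown in the log
    - >-
      network_connections | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0

Because the play defines no team-type connection and network_state is left at its default, the quoted expression evaluates to False and the guard is skipped.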
41016 1727204203.47032: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41016 1727204203.47264: in run() - task 028d2410-947f-12d5-0ec4-000000000071 41016 1727204203.47290: variable 'ansible_search_path' from source: unknown 41016 1727204203.47368: variable 'ansible_search_path' from source: unknown 41016 1727204203.47484: calling self._execute() 41016 1727204203.47871: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204203.48089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204203.48280: variable 'omit' from source: magic vars 41016 1727204203.48677: variable 'ansible_distribution_major_version' from source: facts 41016 1727204203.49080: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204203.49283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204203.53570: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204203.53645: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204203.53689: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204203.53734: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204203.53765: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204203.53859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204203.53899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204203.53939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204203.53988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204203.54009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204203.54137: variable 'ansible_distribution' from source: facts 41016 1727204203.54152: variable 'ansible_distribution_major_version' from source: facts 41016 1727204203.54172: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 41016 1727204203.54300: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204203.54446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204203.54488: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204203.54519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204203.54566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204203.54592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204203.54638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204203.54668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204203.54709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204203.54752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204203.54772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204203.54826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204203.54854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204203.54891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204203.54939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204203.54959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204203.55132: variable 'network_connections' from source: task vars 41016 1727204203.55149: variable 'interface1' from source: play vars 41016 1727204203.55227: variable 'interface1' from source: play vars 41016 1727204203.55305: variable 'interface1_mac' from source: set_fact 41016 1727204203.55397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204203.62650: 
Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204203.62704: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204203.62739: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204203.62778: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204203.62829: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204203.62860: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204203.62903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204203.62933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204203.63083: variable '__network_team_connections_defined' from source: role '' defaults 41016 1727204203.63261: variable 'network_connections' from source: task vars 41016 1727204203.63271: variable 'interface1' from source: play vars 41016 1727204203.63346: variable 'interface1' from source: play vars 41016 1727204203.63429: variable 'interface1_mac' from source: set_fact 41016 1727204203.63469: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41016 1727204203.63479: when evaluation is False, skipping this task 41016 1727204203.63515: _execute() done 41016 1727204203.63522: dumping result to json 41016 1727204203.63524: done dumping result, returning 41016 1727204203.63527: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-12d5-0ec4-000000000071] 41016 1727204203.63529: sending task result for task 028d2410-947f-12d5-0ec4-000000000071 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41016 1727204203.63670: no more pending results, returning what we have 41016 1727204203.63674: results queue empty 41016 1727204203.63675: checking for any_errors_fatal 41016 1727204203.63682: done checking for any_errors_fatal 41016 1727204203.63683: checking for max_fail_percentage 41016 1727204203.63684: done checking for max_fail_percentage 41016 1727204203.63685: checking to see if all hosts have failed and the running result is not ok 41016 1727204203.63686: done checking to see if all hosts have failed 41016 1727204203.63687: getting the remaining hosts for this loop 41016 1727204203.63689: done getting the remaining hosts for this loop 41016 1727204203.63692: getting the next task for host managed-node1 41016 1727204203.63700: done getting next task for host managed-node1 41016 1727204203.63703: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available 
through the YUM package manager due to wireless or team interfaces 41016 1727204203.63706: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204203.63724: getting variables 41016 1727204203.63836: in VariableManager get_vars() 41016 1727204203.63883: Calling all_inventory to load vars for managed-node1 41016 1727204203.63886: Calling groups_inventory to load vars for managed-node1 41016 1727204203.63889: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204203.63899: Calling all_plugins_play to load vars for managed-node1 41016 1727204203.63904: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204203.63908: Calling groups_plugins_play to load vars for managed-node1 41016 1727204203.64492: done sending task result for task 028d2410-947f-12d5-0ec4-000000000071 41016 1727204203.64496: WORKER PROCESS EXITING 41016 1727204203.73025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204203.74930: done with get_vars() 41016 1727204203.74956: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 41016 1727204203.75028: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:56:43 -0400 (0:00:00.293) 0:00:27.426 ***** 41016 1727204203.75068: entering _queue_task() for managed-node1/yum 41016 1727204203.75540: worker is 1 (out of 1 available) 41016 1727204203.75550: exiting _queue_task() for managed-node1/yum 41016 1727204203.75558: done queuing things up, now waiting for results queue to drain 41016 1727204203.75560: waiting for pending results... 
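The DNF-availability check above is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for this profile. A minimal, self-contained illustration of the team-detection filter chain those flags are built from (the connection data here is invented; only the filter expression is taken from the log):

- hosts: localhost
  gather_facts: false
  vars:
    # Assumed stand-in data; the real network_connections is built from the
    # play's interface1 / interface1_mac variables seen in the log.
    network_connections:
      - name: ethtest0
        type: ethernet
        state: up
  tasks:
    - ansible.builtin.debug:
        msg: >-
          {{ network_connections | selectattr('type', 'defined')
          | selectattr('type', 'match', '^team$') | list | length > 0 }}

With only an ethernet profile defined, the selectattr chain yields an empty list, the length test prints False, and the wireless/team-gated tasks are skipped.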
41016 1727204203.75841: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41016 1727204203.76020: in run() - task 028d2410-947f-12d5-0ec4-000000000072 41016 1727204203.76043: variable 'ansible_search_path' from source: unknown 41016 1727204203.76052: variable 'ansible_search_path' from source: unknown 41016 1727204203.76103: calling self._execute() 41016 1727204203.76208: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204203.76222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204203.76237: variable 'omit' from source: magic vars 41016 1727204203.76649: variable 'ansible_distribution_major_version' from source: facts 41016 1727204203.76665: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204203.76854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204203.79071: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204203.79381: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204203.79385: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204203.79388: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204203.79391: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204203.79394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204203.79397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204203.79413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204203.79457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204203.79481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204203.79584: variable 'ansible_distribution_major_version' from source: facts 41016 1727204203.79605: Evaluated conditional (ansible_distribution_major_version | int < 8): False 41016 1727204203.79616: when evaluation is False, skipping this task 41016 1727204203.79627: _execute() done 41016 1727204203.79733: dumping result to json 41016 1727204203.79736: done dumping result, returning 41016 1727204203.79739: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-12d5-0ec4-000000000072] 41016 
1727204203.79742: sending task result for task 028d2410-947f-12d5-0ec4-000000000072 41016 1727204203.79819: done sending task result for task 028d2410-947f-12d5-0ec4-000000000072 41016 1727204203.79823: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 41016 1727204203.79879: no more pending results, returning what we have 41016 1727204203.79883: results queue empty 41016 1727204203.79884: checking for any_errors_fatal 41016 1727204203.79894: done checking for any_errors_fatal 41016 1727204203.79894: checking for max_fail_percentage 41016 1727204203.79896: done checking for max_fail_percentage 41016 1727204203.79897: checking to see if all hosts have failed and the running result is not ok 41016 1727204203.79897: done checking to see if all hosts have failed 41016 1727204203.79898: getting the remaining hosts for this loop 41016 1727204203.79899: done getting the remaining hosts for this loop 41016 1727204203.79904: getting the next task for host managed-node1 41016 1727204203.79913: done getting next task for host managed-node1 41016 1727204203.79917: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41016 1727204203.79919: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204203.79937: getting variables 41016 1727204203.79939: in VariableManager get_vars() 41016 1727204203.79982: Calling all_inventory to load vars for managed-node1 41016 1727204203.79985: Calling groups_inventory to load vars for managed-node1 41016 1727204203.79988: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204203.79997: Calling all_plugins_play to load vars for managed-node1 41016 1727204203.79999: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204203.80002: Calling groups_plugins_play to load vars for managed-node1 41016 1727204203.81580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204203.83199: done with get_vars() 41016 1727204203.83229: done getting variables 41016 1727204203.83294: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:56:43 -0400 (0:00:00.082) 0:00:27.509 ***** 41016 1727204203.83335: entering _queue_task() for managed-node1/fail 41016 1727204203.83715: worker is 1 (out of 1 available) 41016 1727204203.83728: exiting _queue_task() for managed-node1/fail 41016 1727204203.83740: done queuing things up, now waiting for results queue to drain 41016 1727204203.83741: waiting for pending results... 
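The YUM variant of the update check is additionally gated on the distribution version; the managed node reports a major version of 8 or newer, so ansible_distribution_major_version | int < 8 is False and the task is skipped before the wireless/team condition even matters. A hedged sketch of such a version-gated check (module arguments are assumed; only the version condition is quoted from the result, and the real task also carries the wireless/team condition from its name):

- name: Check if updates for network packages are available through the YUM package manager
  ansible.builtin.yum:          # on EL8+ this action is redirected to ansible.builtin.dnf, as the log notes
    list: updates               # argument is illustrative, not taken from the role
  when: ansible_distribution_major_version | int < 8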
41016 1727204203.84497: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41016 1727204203.84664: in run() - task 028d2410-947f-12d5-0ec4-000000000073 41016 1727204203.84701: variable 'ansible_search_path' from source: unknown 41016 1727204203.84706: variable 'ansible_search_path' from source: unknown 41016 1727204203.84786: calling self._execute() 41016 1727204203.84823: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204203.84827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204203.84839: variable 'omit' from source: magic vars 41016 1727204203.85244: variable 'ansible_distribution_major_version' from source: facts 41016 1727204203.85254: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204203.85384: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204203.85682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204203.90003: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204203.90071: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204203.90126: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204203.90165: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204203.90197: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204203.90285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204203.90319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204203.90347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204203.90392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204203.90406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204203.90460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204203.90488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204203.90512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204203.90550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204203.90577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204203.90695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204203.90698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204203.90709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204203.90755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204203.90769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204203.91007: variable 'network_connections' from source: task vars 41016 1727204203.91011: variable 'interface1' from source: play vars 41016 1727204203.91076: variable 'interface1' from source: play vars 41016 1727204203.91292: variable 'interface1_mac' from source: set_fact 41016 1727204203.91295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204203.91410: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204203.91451: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204203.91491: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204203.91521: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204203.91565: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204203.91588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204203.91618: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204203.91640: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204203.91780: variable '__network_team_connections_defined' from source: 
role '' defaults 41016 1727204203.91929: variable 'network_connections' from source: task vars 41016 1727204203.91932: variable 'interface1' from source: play vars 41016 1727204203.91997: variable 'interface1' from source: play vars 41016 1727204203.92064: variable 'interface1_mac' from source: set_fact 41016 1727204203.92106: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41016 1727204203.92109: when evaluation is False, skipping this task 41016 1727204203.92112: _execute() done 41016 1727204203.92117: dumping result to json 41016 1727204203.92121: done dumping result, returning 41016 1727204203.92129: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-12d5-0ec4-000000000073] 41016 1727204203.92140: sending task result for task 028d2410-947f-12d5-0ec4-000000000073 41016 1727204203.92226: done sending task result for task 028d2410-947f-12d5-0ec4-000000000073 41016 1727204203.92229: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41016 1727204203.92315: no more pending results, returning what we have 41016 1727204203.92320: results queue empty 41016 1727204203.92321: checking for any_errors_fatal 41016 1727204203.92328: done checking for any_errors_fatal 41016 1727204203.92328: checking for max_fail_percentage 41016 1727204203.92330: done checking for max_fail_percentage 41016 1727204203.92331: checking to see if all hosts have failed and the running result is not ok 41016 1727204203.92332: done checking to see if all hosts have failed 41016 1727204203.92332: getting the remaining hosts for this loop 41016 1727204203.92334: done getting the remaining hosts for this loop 41016 1727204203.92337: getting the next task for host managed-node1 41016 1727204203.92345: done getting next task for host managed-node1 41016 1727204203.92349: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 41016 1727204203.92352: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204203.92368: getting variables 41016 1727204203.92370: in VariableManager get_vars() 41016 1727204203.92583: Calling all_inventory to load vars for managed-node1 41016 1727204203.92586: Calling groups_inventory to load vars for managed-node1 41016 1727204203.92588: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204203.92597: Calling all_plugins_play to load vars for managed-node1 41016 1727204203.92600: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204203.92602: Calling groups_plugins_play to load vars for managed-node1 41016 1727204203.94131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204203.96223: done with get_vars() 41016 1727204203.96248: done getting variables 41016 1727204203.96309: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:56:43 -0400 (0:00:00.130) 0:00:27.639 ***** 41016 1727204203.96345: entering _queue_task() for managed-node1/package 41016 1727204203.96681: worker is 1 (out of 1 available) 41016 1727204203.96693: exiting _queue_task() for managed-node1/package 41016 1727204203.96707: done queuing things up, now waiting for results queue to drain 41016 1727204203.96708: waiting for pending results... 41016 1727204203.97096: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 41016 1727204203.97149: in run() - task 028d2410-947f-12d5-0ec4-000000000074 41016 1727204203.97166: variable 'ansible_search_path' from source: unknown 41016 1727204203.97173: variable 'ansible_search_path' from source: unknown 41016 1727204203.97217: calling self._execute() 41016 1727204203.97313: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204203.97324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204203.97338: variable 'omit' from source: magic vars 41016 1727204203.97878: variable 'ansible_distribution_major_version' from source: facts 41016 1727204203.97896: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204203.98304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204203.99072: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204203.99078: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204203.99081: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204203.99135: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204203.99386: variable 'network_packages' from source: role '' defaults 41016 1727204203.99581: variable '__network_provider_setup' from source: role '' defaults 41016 1727204203.99622: variable '__network_service_name_default_nm' from source: role '' defaults 41016 1727204203.99848: variable 
'__network_service_name_default_nm' from source: role '' defaults 41016 1727204203.99862: variable '__network_packages_default_nm' from source: role '' defaults 41016 1727204203.99927: variable '__network_packages_default_nm' from source: role '' defaults 41016 1727204204.00227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204204.04241: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204204.04345: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204204.04620: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204204.04880: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204204.04884: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204204.04887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204204.04890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204204.04893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204204.05084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204204.05106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204204.05159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204204.05222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204204.05295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204204.05430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204204.05488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204204.05939: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41016 1727204204.06115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204204.06144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204204.06172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204204.06222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204204.06242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204204.06353: variable 'ansible_python' from source: facts 41016 1727204204.06385: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41016 1727204204.06482: variable '__network_wpa_supplicant_required' from source: role '' defaults 41016 1727204204.06642: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41016 1727204204.06714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204204.06742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204204.06780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204204.06822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204204.06841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204204.06905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204204.06949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204204.06991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204204.07040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204204.07081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204204.07235: variable 'network_connections' from source: task vars 41016 1727204204.07247: variable 'interface1' from source: play vars 41016 1727204204.07385: variable 'interface1' from source: play vars 41016 1727204204.07495: variable 'interface1_mac' from source: set_fact 41016 1727204204.07608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204204.07663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204204.07733: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204204.07737: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204204.07802: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204204.08119: variable 'network_connections' from source: task vars 41016 1727204204.08129: variable 'interface1' from source: play vars 41016 1727204204.08245: variable 'interface1' from source: play vars 41016 1727204204.08365: variable 'interface1_mac' from source: set_fact 41016 1727204204.08449: variable '__network_packages_default_wireless' from source: role '' defaults 41016 1727204204.08602: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204204.08985: variable 'network_connections' from source: task vars 41016 1727204204.08988: variable 'interface1' from source: play vars 41016 1727204204.09052: variable 'interface1' from source: play vars 41016 1727204204.09215: variable 'interface1_mac' from source: set_fact 41016 1727204204.09319: variable '__network_packages_default_team' from source: role '' defaults 41016 1727204204.09580: variable '__network_team_connections_defined' from source: role '' defaults 41016 1727204204.09909: variable 'network_connections' from source: task vars 41016 1727204204.09920: variable 'interface1' from source: play vars 41016 1727204204.09994: variable 'interface1' from source: play vars 41016 1727204204.10081: variable 'interface1_mac' from source: set_fact 41016 1727204204.10154: variable '__network_service_name_default_initscripts' from source: role '' defaults 41016 1727204204.10220: variable '__network_service_name_default_initscripts' from source: role '' defaults 41016 1727204204.10235: variable '__network_packages_default_initscripts' from source: role '' defaults 41016 1727204204.10311: variable '__network_packages_default_initscripts' from source: role '' defaults 41016 1727204204.10552: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41016 1727204204.11083: variable 'network_connections' from source: task vars 41016 1727204204.11112: variable 'interface1' from source: play vars 41016 1727204204.11174: variable 'interface1' from source: play vars 41016 1727204204.11265: variable 'interface1_mac' from source: set_fact 41016 1727204204.11294: variable 'ansible_distribution' from source: facts 41016 
1727204204.11302: variable '__network_rh_distros' from source: role '' defaults 41016 1727204204.11313: variable 'ansible_distribution_major_version' from source: facts 41016 1727204204.11342: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41016 1727204204.11514: variable 'ansible_distribution' from source: facts 41016 1727204204.11524: variable '__network_rh_distros' from source: role '' defaults 41016 1727204204.11561: variable 'ansible_distribution_major_version' from source: facts 41016 1727204204.11565: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41016 1727204204.11775: variable 'ansible_distribution' from source: facts 41016 1727204204.11788: variable '__network_rh_distros' from source: role '' defaults 41016 1727204204.11880: variable 'ansible_distribution_major_version' from source: facts 41016 1727204204.11883: variable 'network_provider' from source: set_fact 41016 1727204204.11887: variable 'ansible_facts' from source: unknown 41016 1727204204.13417: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 41016 1727204204.13421: when evaluation is False, skipping this task 41016 1727204204.13423: _execute() done 41016 1727204204.13426: dumping result to json 41016 1727204204.13428: done dumping result, returning 41016 1727204204.13430: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-12d5-0ec4-000000000074] 41016 1727204204.13432: sending task result for task 028d2410-947f-12d5-0ec4-000000000074 41016 1727204204.13555: done sending task result for task 028d2410-947f-12d5-0ec4-000000000074 41016 1727204204.13559: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 41016 1727204204.13615: no more pending results, returning what we have 41016 1727204204.13620: results queue empty 41016 1727204204.13621: checking for any_errors_fatal 41016 1727204204.13628: done checking for any_errors_fatal 41016 1727204204.13629: checking for max_fail_percentage 41016 1727204204.13631: done checking for max_fail_percentage 41016 1727204204.13632: checking to see if all hosts have failed and the running result is not ok 41016 1727204204.13633: done checking to see if all hosts have failed 41016 1727204204.13634: getting the remaining hosts for this loop 41016 1727204204.13635: done getting the remaining hosts for this loop 41016 1727204204.13640: getting the next task for host managed-node1 41016 1727204204.13648: done getting next task for host managed-node1 41016 1727204204.13652: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41016 1727204204.13655: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204204.13673: getting variables 41016 1727204204.13674: in VariableManager get_vars() 41016 1727204204.14026: Calling all_inventory to load vars for managed-node1 41016 1727204204.14029: Calling groups_inventory to load vars for managed-node1 41016 1727204204.14032: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204204.14045: Calling all_plugins_play to load vars for managed-node1 41016 1727204204.14048: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204204.14050: Calling groups_plugins_play to load vars for managed-node1 41016 1727204204.15721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204204.17573: done with get_vars() 41016 1727204204.17594: done getting variables 41016 1727204204.17653: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:56:44 -0400 (0:00:00.213) 0:00:27.852 ***** 41016 1727204204.17688: entering _queue_task() for managed-node1/package 41016 1727204204.18414: worker is 1 (out of 1 available) 41016 1727204204.18424: exiting _queue_task() for managed-node1/package 41016 1727204204.18434: done queuing things up, now waiting for results queue to drain 41016 1727204204.18435: waiting for pending results... 
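The Install packages task is skipped because everything in network_packages is already present in the package facts gathered earlier, so not network_packages is subset(ansible_facts.packages.keys()) comes out False. A small stand-alone illustration of that subset test (variable contents are invented, and a stand-in dict replaces ansible_facts.packages):

- hosts: localhost
  gather_facts: false
  vars:
    network_packages: ['NetworkManager']     # assumed package list
    installed_packages:                      # stand-in for ansible_facts.packages
      NetworkManager: [{version: '1.0.0'}]
  tasks:
    - ansible.builtin.debug:
        msg: "{{ not network_packages is subset(installed_packages.keys()) }}"   # prints False, so the install would be skipped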
41016 1727204204.18869: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41016 1727204204.19100: in run() - task 028d2410-947f-12d5-0ec4-000000000075 41016 1727204204.19143: variable 'ansible_search_path' from source: unknown 41016 1727204204.19149: variable 'ansible_search_path' from source: unknown 41016 1727204204.19296: calling self._execute() 41016 1727204204.19415: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204204.19419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204204.19428: variable 'omit' from source: magic vars 41016 1727204204.19928: variable 'ansible_distribution_major_version' from source: facts 41016 1727204204.19942: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204204.20164: variable 'network_state' from source: role '' defaults 41016 1727204204.20167: Evaluated conditional (network_state != {}): False 41016 1727204204.20170: when evaluation is False, skipping this task 41016 1727204204.20171: _execute() done 41016 1727204204.20173: dumping result to json 41016 1727204204.20177: done dumping result, returning 41016 1727204204.20180: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-12d5-0ec4-000000000075] 41016 1727204204.20183: sending task result for task 028d2410-947f-12d5-0ec4-000000000075 41016 1727204204.20255: done sending task result for task 028d2410-947f-12d5-0ec4-000000000075 41016 1727204204.20259: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41016 1727204204.20315: no more pending results, returning what we have 41016 1727204204.20320: results queue empty 41016 1727204204.20322: checking for any_errors_fatal 41016 1727204204.20330: done checking for any_errors_fatal 41016 1727204204.20331: checking for max_fail_percentage 41016 1727204204.20333: done checking for max_fail_percentage 41016 1727204204.20334: checking to see if all hosts have failed and the running result is not ok 41016 1727204204.20334: done checking to see if all hosts have failed 41016 1727204204.20335: getting the remaining hosts for this loop 41016 1727204204.20337: done getting the remaining hosts for this loop 41016 1727204204.20341: getting the next task for host managed-node1 41016 1727204204.20350: done getting next task for host managed-node1 41016 1727204204.20354: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41016 1727204204.20358: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204204.20384: getting variables 41016 1727204204.20387: in VariableManager get_vars() 41016 1727204204.20687: Calling all_inventory to load vars for managed-node1 41016 1727204204.20690: Calling groups_inventory to load vars for managed-node1 41016 1727204204.20693: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204204.20703: Calling all_plugins_play to load vars for managed-node1 41016 1727204204.20707: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204204.20710: Calling groups_plugins_play to load vars for managed-node1 41016 1727204204.22739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204204.24518: done with get_vars() 41016 1727204204.24544: done getting variables 41016 1727204204.24614: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:56:44 -0400 (0:00:00.069) 0:00:27.922 ***** 41016 1727204204.24662: entering _queue_task() for managed-node1/package 41016 1727204204.25070: worker is 1 (out of 1 available) 41016 1727204204.25093: exiting _queue_task() for managed-node1/package 41016 1727204204.25107: done queuing things up, now waiting for results queue to drain 41016 1727204204.25108: waiting for pending results... 
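The nmstate-related install tasks here are gated on network_state, which stays at its role default of {} because this play drives everything through network_connections. A hedged sketch of what such a gated install looks like (package names and arguments are inferred from the task name and the 'package' action seen in the log, not from the role source):

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}    # quoted from the false_condition; {} is the role default in this run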
41016 1727204204.25497: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41016 1727204204.25638: in run() - task 028d2410-947f-12d5-0ec4-000000000076 41016 1727204204.25651: variable 'ansible_search_path' from source: unknown 41016 1727204204.25654: variable 'ansible_search_path' from source: unknown 41016 1727204204.25699: calling self._execute() 41016 1727204204.25872: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204204.25916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204204.25920: variable 'omit' from source: magic vars 41016 1727204204.26340: variable 'ansible_distribution_major_version' from source: facts 41016 1727204204.26352: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204204.26478: variable 'network_state' from source: role '' defaults 41016 1727204204.26488: Evaluated conditional (network_state != {}): False 41016 1727204204.26492: when evaluation is False, skipping this task 41016 1727204204.26495: _execute() done 41016 1727204204.26498: dumping result to json 41016 1727204204.26501: done dumping result, returning 41016 1727204204.26566: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-12d5-0ec4-000000000076] 41016 1727204204.26569: sending task result for task 028d2410-947f-12d5-0ec4-000000000076 41016 1727204204.26634: done sending task result for task 028d2410-947f-12d5-0ec4-000000000076 41016 1727204204.26637: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41016 1727204204.26701: no more pending results, returning what we have 41016 1727204204.26706: results queue empty 41016 1727204204.26707: checking for any_errors_fatal 41016 1727204204.26717: done checking for any_errors_fatal 41016 1727204204.26718: checking for max_fail_percentage 41016 1727204204.26719: done checking for max_fail_percentage 41016 1727204204.26720: checking to see if all hosts have failed and the running result is not ok 41016 1727204204.26721: done checking to see if all hosts have failed 41016 1727204204.26722: getting the remaining hosts for this loop 41016 1727204204.26723: done getting the remaining hosts for this loop 41016 1727204204.26727: getting the next task for host managed-node1 41016 1727204204.26735: done getting next task for host managed-node1 41016 1727204204.26738: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41016 1727204204.26742: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204204.26763: getting variables 41016 1727204204.26765: in VariableManager get_vars() 41016 1727204204.26816: Calling all_inventory to load vars for managed-node1 41016 1727204204.26819: Calling groups_inventory to load vars for managed-node1 41016 1727204204.26822: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204204.26835: Calling all_plugins_play to load vars for managed-node1 41016 1727204204.26838: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204204.26841: Calling groups_plugins_play to load vars for managed-node1 41016 1727204204.27943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204204.28825: done with get_vars() 41016 1727204204.28857: done getting variables 41016 1727204204.28929: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:56:44 -0400 (0:00:00.043) 0:00:27.965 ***** 41016 1727204204.28965: entering _queue_task() for managed-node1/service 41016 1727204204.29336: worker is 1 (out of 1 available) 41016 1727204204.29350: exiting _queue_task() for managed-node1/service 41016 1727204204.29361: done queuing things up, now waiting for results queue to drain 41016 1727204204.29363: waiting for pending results... 
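
The service task queued at this point is evaluated a few records below and ends up skipped, because neither __network_wireless_connections_defined nor __network_team_connections_defined holds for the configured connections. As a rough sketch only (the service name and restarted state are assumptions inferred from the task title, not taken from the role's task file), such a conditional restart could be written as:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager   # assumed from the task title
    state: restarted       # assumed: "Restart" in the title
  when: __network_wireless_connections_defined or __network_team_connections_defined
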
41016 1727204204.29705: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41016 1727204204.29782: in run() - task 028d2410-947f-12d5-0ec4-000000000077 41016 1727204204.29786: variable 'ansible_search_path' from source: unknown 41016 1727204204.29789: variable 'ansible_search_path' from source: unknown 41016 1727204204.29853: calling self._execute() 41016 1727204204.29955: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204204.29959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204204.29967: variable 'omit' from source: magic vars 41016 1727204204.30257: variable 'ansible_distribution_major_version' from source: facts 41016 1727204204.30265: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204204.30352: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204204.30481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204204.32007: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204204.32090: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204204.32096: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204204.32128: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204204.32155: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204204.32233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204204.32260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204204.32284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204204.32324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204204.32338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204204.32428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204204.32431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204204.32434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 41016 1727204204.32470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204204.32505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204204.32554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204204.32557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204204.32567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204204.32679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204204.32682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204204.32887: variable 'network_connections' from source: task vars 41016 1727204204.32891: variable 'interface1' from source: play vars 41016 1727204204.32893: variable 'interface1' from source: play vars 41016 1727204204.32955: variable 'interface1_mac' from source: set_fact 41016 1727204204.33033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204204.33204: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204204.33241: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204204.33325: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204204.33328: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204204.33350: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204204.33362: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204204.33388: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204204.33416: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204204.33478: variable '__network_team_connections_defined' from source: role '' defaults 41016 1727204204.33663: variable 'network_connections' from 
source: task vars 41016 1727204204.33667: variable 'interface1' from source: play vars 41016 1727204204.33710: variable 'interface1' from source: play vars 41016 1727204204.33773: variable 'interface1_mac' from source: set_fact 41016 1727204204.33803: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41016 1727204204.33806: when evaluation is False, skipping this task 41016 1727204204.33809: _execute() done 41016 1727204204.33811: dumping result to json 41016 1727204204.33817: done dumping result, returning 41016 1727204204.33823: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-12d5-0ec4-000000000077] 41016 1727204204.33834: sending task result for task 028d2410-947f-12d5-0ec4-000000000077 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41016 1727204204.33958: no more pending results, returning what we have 41016 1727204204.33961: results queue empty 41016 1727204204.33962: checking for any_errors_fatal 41016 1727204204.33972: done checking for any_errors_fatal 41016 1727204204.33973: checking for max_fail_percentage 41016 1727204204.33974: done checking for max_fail_percentage 41016 1727204204.33982: checking to see if all hosts have failed and the running result is not ok 41016 1727204204.33984: done checking to see if all hosts have failed 41016 1727204204.33984: getting the remaining hosts for this loop 41016 1727204204.33986: done getting the remaining hosts for this loop 41016 1727204204.33989: getting the next task for host managed-node1 41016 1727204204.33996: done getting next task for host managed-node1 41016 1727204204.34000: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41016 1727204204.34002: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204204.34013: done sending task result for task 028d2410-947f-12d5-0ec4-000000000077 41016 1727204204.34016: WORKER PROCESS EXITING 41016 1727204204.34028: getting variables 41016 1727204204.34030: in VariableManager get_vars() 41016 1727204204.34070: Calling all_inventory to load vars for managed-node1 41016 1727204204.34072: Calling groups_inventory to load vars for managed-node1 41016 1727204204.34077: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204204.34093: Calling all_plugins_play to load vars for managed-node1 41016 1727204204.34096: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204204.34099: Calling groups_plugins_play to load vars for managed-node1 41016 1727204204.34885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204204.36145: done with get_vars() 41016 1727204204.36161: done getting variables 41016 1727204204.36209: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:56:44 -0400 (0:00:00.072) 0:00:28.038 ***** 41016 1727204204.36234: entering _queue_task() for managed-node1/service 41016 1727204204.36525: worker is 1 (out of 1 available) 41016 1727204204.36537: exiting _queue_task() for managed-node1/service 41016 1727204204.36551: done queuing things up, now waiting for results queue to drain 41016 1727204204.36552: waiting for pending results... 
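
Unlike the previous tasks, this one does execute: its condition (network_provider == "nm" or network_state != {}) evaluates to True, and the module invocation recorded further down shows the systemd module being called with name=NetworkManager, state=started, enabled=true. A sketch of an equivalent task, assuming the service name is supplied through the network_service_name variable that the trace resolves just below (the exact wording in the role's tasks/main.yml may differ), is:

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"   # resolves to NetworkManager in this run
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
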
41016 1727204204.36852: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41016 1727204204.37010: in run() - task 028d2410-947f-12d5-0ec4-000000000078 41016 1727204204.37033: variable 'ansible_search_path' from source: unknown 41016 1727204204.37040: variable 'ansible_search_path' from source: unknown 41016 1727204204.37082: calling self._execute() 41016 1727204204.37217: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204204.37220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204204.37223: variable 'omit' from source: magic vars 41016 1727204204.37587: variable 'ansible_distribution_major_version' from source: facts 41016 1727204204.37596: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204204.37714: variable 'network_provider' from source: set_fact 41016 1727204204.37718: variable 'network_state' from source: role '' defaults 41016 1727204204.37725: Evaluated conditional (network_provider == "nm" or network_state != {}): True 41016 1727204204.37731: variable 'omit' from source: magic vars 41016 1727204204.37766: variable 'omit' from source: magic vars 41016 1727204204.37789: variable 'network_service_name' from source: role '' defaults 41016 1727204204.37841: variable 'network_service_name' from source: role '' defaults 41016 1727204204.37935: variable '__network_provider_setup' from source: role '' defaults 41016 1727204204.37939: variable '__network_service_name_default_nm' from source: role '' defaults 41016 1727204204.37985: variable '__network_service_name_default_nm' from source: role '' defaults 41016 1727204204.37992: variable '__network_packages_default_nm' from source: role '' defaults 41016 1727204204.38038: variable '__network_packages_default_nm' from source: role '' defaults 41016 1727204204.38181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204204.40722: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204204.40790: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204204.40843: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204204.40884: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204204.40903: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204204.40961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204204.40991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204204.41015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204204.41045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 41016 1727204204.41051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204204.41087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204204.41103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204204.41121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204204.41145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204204.41156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204204.41301: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41016 1727204204.41379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204204.41399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204204.41415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204204.41439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204204.41449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204204.41515: variable 'ansible_python' from source: facts 41016 1727204204.41532: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41016 1727204204.41591: variable '__network_wpa_supplicant_required' from source: role '' defaults 41016 1727204204.41645: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41016 1727204204.41730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204204.41747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204204.41764: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204204.41790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204204.41802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204204.41837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204204.41856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204204.41872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204204.41898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204204.41913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204204.42003: variable 'network_connections' from source: task vars 41016 1727204204.42011: variable 'interface1' from source: play vars 41016 1727204204.42064: variable 'interface1' from source: play vars 41016 1727204204.42128: variable 'interface1_mac' from source: set_fact 41016 1727204204.42212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204204.42341: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204204.42379: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204204.42411: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204204.42439: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204204.42488: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204204.42509: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204204.42532: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204204.42554: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 
1727204204.42592: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204204.42909: variable 'network_connections' from source: task vars 41016 1727204204.42915: variable 'interface1' from source: play vars 41016 1727204204.42917: variable 'interface1' from source: play vars 41016 1727204204.43021: variable 'interface1_mac' from source: set_fact 41016 1727204204.43281: variable '__network_packages_default_wireless' from source: role '' defaults 41016 1727204204.43284: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204204.43487: variable 'network_connections' from source: task vars 41016 1727204204.43491: variable 'interface1' from source: play vars 41016 1727204204.43493: variable 'interface1' from source: play vars 41016 1727204204.43516: variable 'interface1_mac' from source: set_fact 41016 1727204204.43543: variable '__network_packages_default_team' from source: role '' defaults 41016 1727204204.43614: variable '__network_team_connections_defined' from source: role '' defaults 41016 1727204204.43866: variable 'network_connections' from source: task vars 41016 1727204204.43869: variable 'interface1' from source: play vars 41016 1727204204.43935: variable 'interface1' from source: play vars 41016 1727204204.44016: variable 'interface1_mac' from source: set_fact 41016 1727204204.44078: variable '__network_service_name_default_initscripts' from source: role '' defaults 41016 1727204204.44136: variable '__network_service_name_default_initscripts' from source: role '' defaults 41016 1727204204.44140: variable '__network_packages_default_initscripts' from source: role '' defaults 41016 1727204204.44198: variable '__network_packages_default_initscripts' from source: role '' defaults 41016 1727204204.44402: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41016 1727204204.44859: variable 'network_connections' from source: task vars 41016 1727204204.44862: variable 'interface1' from source: play vars 41016 1727204204.44925: variable 'interface1' from source: play vars 41016 1727204204.44994: variable 'interface1_mac' from source: set_fact 41016 1727204204.45006: variable 'ansible_distribution' from source: facts 41016 1727204204.45012: variable '__network_rh_distros' from source: role '' defaults 41016 1727204204.45015: variable 'ansible_distribution_major_version' from source: facts 41016 1727204204.45036: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41016 1727204204.45203: variable 'ansible_distribution' from source: facts 41016 1727204204.45207: variable '__network_rh_distros' from source: role '' defaults 41016 1727204204.45225: variable 'ansible_distribution_major_version' from source: facts 41016 1727204204.45228: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41016 1727204204.45405: variable 'ansible_distribution' from source: facts 41016 1727204204.45408: variable '__network_rh_distros' from source: role '' defaults 41016 1727204204.45413: variable 'ansible_distribution_major_version' from source: facts 41016 1727204204.45480: variable 'network_provider' from source: set_fact 41016 1727204204.45483: variable 'omit' from source: magic vars 41016 1727204204.45491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204204.45518: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 41016 1727204204.45536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204204.45609: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204204.45614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204204.45617: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204204.45619: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204204.45621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204204.45690: Set connection var ansible_shell_executable to /bin/sh 41016 1727204204.45694: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204204.45701: Set connection var ansible_shell_type to sh 41016 1727204204.45706: Set connection var ansible_timeout to 10 41016 1727204204.45716: Set connection var ansible_pipelining to False 41016 1727204204.45719: Set connection var ansible_connection to ssh 41016 1727204204.45742: variable 'ansible_shell_executable' from source: unknown 41016 1727204204.45745: variable 'ansible_connection' from source: unknown 41016 1727204204.45747: variable 'ansible_module_compression' from source: unknown 41016 1727204204.45749: variable 'ansible_shell_type' from source: unknown 41016 1727204204.45752: variable 'ansible_shell_executable' from source: unknown 41016 1727204204.45767: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204204.45769: variable 'ansible_pipelining' from source: unknown 41016 1727204204.45772: variable 'ansible_timeout' from source: unknown 41016 1727204204.45774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204204.45879: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204204.45883: variable 'omit' from source: magic vars 41016 1727204204.45885: starting attempt loop 41016 1727204204.45887: running the handler 41016 1727204204.45982: variable 'ansible_facts' from source: unknown 41016 1727204204.46591: _low_level_execute_command(): starting 41016 1727204204.46597: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204204.47283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204204.47297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204204.47308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204204.47322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204204.47336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204204.47344: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204204.47352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204204.47366: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204204.47374: stderr 
chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204204.47391: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41016 1727204204.47394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204204.47453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204204.47456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204204.47459: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204204.47461: stderr chunk (state=3): >>>debug2: match found <<< 41016 1727204204.47463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204204.47482: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204204.47499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204204.47532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204204.47615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204204.49391: stdout chunk (state=3): >>>/root <<< 41016 1727204204.49563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204204.49567: stdout chunk (state=3): >>><<< 41016 1727204204.49569: stderr chunk (state=3): >>><<< 41016 1727204204.49762: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204204.49764: _low_level_execute_command(): starting 41016 1727204204.49766: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204204.495924-42992-245874342437091 `" && echo ansible-tmp-1727204204.495924-42992-245874342437091="` echo /root/.ansible/tmp/ansible-tmp-1727204204.495924-42992-245874342437091 `" ) && sleep 0' 41016 1727204204.50582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204204.50597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204204.50619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204204.50740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204204.52857: stdout chunk (state=3): >>>ansible-tmp-1727204204.495924-42992-245874342437091=/root/.ansible/tmp/ansible-tmp-1727204204.495924-42992-245874342437091 <<< 41016 1727204204.53029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204204.53032: stdout chunk (state=3): >>><<< 41016 1727204204.53035: stderr chunk (state=3): >>><<< 41016 1727204204.53052: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204204.495924-42992-245874342437091=/root/.ansible/tmp/ansible-tmp-1727204204.495924-42992-245874342437091 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204204.53182: variable 'ansible_module_compression' from source: unknown 41016 1727204204.53189: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 41016 1727204204.53234: variable 'ansible_facts' from source: unknown 41016 1727204204.53446: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204204.495924-42992-245874342437091/AnsiballZ_systemd.py 41016 1727204204.53696: Sending initial data 41016 1727204204.53699: Sent initial data (155 bytes) 41016 1727204204.54235: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204204.54292: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204204.54357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204204.54374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204204.54401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204204.54517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204204.56290: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204204.56369: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204204.56456: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmphjt77qyz /root/.ansible/tmp/ansible-tmp-1727204204.495924-42992-245874342437091/AnsiballZ_systemd.py <<< 41016 1727204204.56459: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204204.495924-42992-245874342437091/AnsiballZ_systemd.py" <<< 41016 1727204204.56529: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmphjt77qyz" to remote "/root/.ansible/tmp/ansible-tmp-1727204204.495924-42992-245874342437091/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204204.495924-42992-245874342437091/AnsiballZ_systemd.py" <<< 41016 1727204204.58091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204204.58185: stderr chunk (state=3): >>><<< 41016 1727204204.58189: stdout chunk (state=3): >>><<< 41016 1727204204.58191: done transferring module to remote 41016 1727204204.58193: _low_level_execute_command(): starting 41016 1727204204.58195: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204204.495924-42992-245874342437091/ /root/.ansible/tmp/ansible-tmp-1727204204.495924-42992-245874342437091/AnsiballZ_systemd.py && sleep 0' 41016 1727204204.58600: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204204.58603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204204.58606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41016 1727204204.58608: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204204.58614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204204.58658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204204.58662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204204.58748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204204.60795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204204.60799: stdout chunk (state=3): >>><<< 41016 1727204204.60801: stderr chunk (state=3): >>><<< 41016 1727204204.60900: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204204.60904: _low_level_execute_command(): starting 41016 1727204204.60907: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204204.495924-42992-245874342437091/AnsiballZ_systemd.py && sleep 0' 41016 1727204204.61490: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204204.61539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204204.61554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 41016 1727204204.61574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204204.61658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204204.61679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204204.61695: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204204.61717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204204.61838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204204.93224: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", 
"FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10747904", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3297878016", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1609043000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": 
"infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", 
"Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 41016 1727204204.95912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204204.95916: stdout chunk (state=3): >>><<< 41016 1727204204.95918: stderr chunk (state=3): >>><<< 41016 1727204204.95921: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {...}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 41016 1727204204.95928: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204204.495924-42992-245874342437091/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204204.96029: _low_level_execute_command(): starting 41016 1727204204.96047: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204204.495924-42992-245874342437091/ > /dev/null 2>&1 && sleep 0' 41016 1727204204.96823: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204204.96836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204204.96847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204204.96899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204204.96916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204204.96956: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 41016 1727204204.97082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204204.99221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204204.99224: stdout chunk (state=3): >>><<< 41016 1727204204.99227: stderr chunk (state=3): >>><<< 41016 1727204204.99381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204204.99384: handler run complete 41016 1727204204.99386: attempt loop complete, returning result 41016 1727204204.99388: _execute() done 41016 1727204204.99390: dumping result to json 41016 1727204204.99392: done dumping result, returning 41016 1727204204.99394: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-12d5-0ec4-000000000078] 41016 1727204204.99396: sending task result for task 028d2410-947f-12d5-0ec4-000000000078 ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41016 1727204204.99728: no more pending results, returning what we have 41016 1727204204.99732: results queue empty 41016 1727204204.99733: checking for any_errors_fatal 41016 1727204204.99742: done checking for any_errors_fatal 41016 1727204204.99742: checking for max_fail_percentage 41016 1727204204.99744: done checking for max_fail_percentage 41016 1727204204.99745: checking to see if all hosts have failed and the running result is not ok 41016 1727204204.99746: done checking to see if all hosts have failed 41016 1727204204.99746: getting the remaining hosts for this loop 41016 1727204204.99748: done getting the remaining hosts for this loop 41016 1727204204.99751: getting the next task for host managed-node1 41016 1727204204.99759: done getting next task for host managed-node1 41016 1727204204.99762: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41016 1727204204.99765: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204204.99777: getting variables 41016 1727204204.99779: in VariableManager get_vars() 41016 1727204204.99815: Calling all_inventory to load vars for managed-node1 41016 1727204204.99818: Calling groups_inventory to load vars for managed-node1 41016 1727204204.99820: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204204.99829: Calling all_plugins_play to load vars for managed-node1 41016 1727204204.99832: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204204.99834: Calling groups_plugins_play to load vars for managed-node1 41016 1727204205.01158: done sending task result for task 028d2410-947f-12d5-0ec4-000000000078 41016 1727204205.01162: WORKER PROCESS EXITING 41016 1727204205.01971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204205.04124: done with get_vars() 41016 1727204205.04151: done getting variables 41016 1727204205.04219: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:56:45 -0400 (0:00:00.680) 0:00:28.718 ***** 41016 1727204205.04255: entering _queue_task() for managed-node1/service 41016 1727204205.04839: worker is 1 (out of 1 available) 41016 1727204205.04853: exiting _queue_task() for managed-node1/service 41016 1727204205.04867: done queuing things up, now waiting for results queue to drain 41016 1727204205.04868: waiting for pending results... 
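The ok: [managed-node1] result above prints only a "censored" placeholder because the task ran with no_log enabled (the module invocation shows '_ansible_no_log': True), even though this debug stream still records the raw module JSON. A hypothetical sketch of that pattern, not the role's own task file:

    # no_log: true replaces the task's result in normal callback output with the
    # "censored" message; it does not change what the module itself does.
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true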
41016 1727204205.05447: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41016 1727204205.05663: in run() - task 028d2410-947f-12d5-0ec4-000000000079 41016 1727204205.05687: variable 'ansible_search_path' from source: unknown 41016 1727204205.05695: variable 'ansible_search_path' from source: unknown 41016 1727204205.05744: calling self._execute() 41016 1727204205.05871: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204205.05887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204205.05902: variable 'omit' from source: magic vars 41016 1727204205.06305: variable 'ansible_distribution_major_version' from source: facts 41016 1727204205.06327: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204205.06457: variable 'network_provider' from source: set_fact 41016 1727204205.06472: Evaluated conditional (network_provider == "nm"): True 41016 1727204205.06578: variable '__network_wpa_supplicant_required' from source: role '' defaults 41016 1727204205.06666: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41016 1727204205.06841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204205.09058: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204205.09159: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204205.09281: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204205.09361: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204205.09364: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204205.09450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204205.09491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204205.09525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204205.09884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204205.09888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204205.09991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204205.09994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 41016 1727204205.09997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204205.09998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204205.10015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204205.10061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204205.10198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204205.10239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204205.10328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204205.10348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204205.10687: variable 'network_connections' from source: task vars 41016 1727204205.10771: variable 'interface1' from source: play vars 41016 1727204205.10862: variable 'interface1' from source: play vars 41016 1727204205.11040: variable 'interface1_mac' from source: set_fact 41016 1727204205.11153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204205.11349: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204205.11392: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204205.11432: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204205.11465: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204205.11520: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204205.11548: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204205.11577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204205.11606: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204205.11661: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204205.11933: variable 'network_connections' from source: task vars 41016 1727204205.11948: variable 'interface1' from source: play vars 41016 1727204205.12015: variable 'interface1' from source: play vars 41016 1727204205.12097: variable 'interface1_mac' from source: set_fact 41016 1727204205.12149: Evaluated conditional (__network_wpa_supplicant_required): False 41016 1727204205.12165: when evaluation is False, skipping this task 41016 1727204205.12168: _execute() done 41016 1727204205.12276: dumping result to json 41016 1727204205.12279: done dumping result, returning 41016 1727204205.12290: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-12d5-0ec4-000000000079] 41016 1727204205.12292: sending task result for task 028d2410-947f-12d5-0ec4-000000000079 41016 1727204205.12363: done sending task result for task 028d2410-947f-12d5-0ec4-000000000079 41016 1727204205.12366: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 41016 1727204205.12425: no more pending results, returning what we have 41016 1727204205.12429: results queue empty 41016 1727204205.12431: checking for any_errors_fatal 41016 1727204205.12452: done checking for any_errors_fatal 41016 1727204205.12453: checking for max_fail_percentage 41016 1727204205.12454: done checking for max_fail_percentage 41016 1727204205.12455: checking to see if all hosts have failed and the running result is not ok 41016 1727204205.12456: done checking to see if all hosts have failed 41016 1727204205.12457: getting the remaining hosts for this loop 41016 1727204205.12458: done getting the remaining hosts for this loop 41016 1727204205.12462: getting the next task for host managed-node1 41016 1727204205.12470: done getting next task for host managed-node1 41016 1727204205.12475: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 41016 1727204205.12480: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204205.12498: getting variables 41016 1727204205.12500: in VariableManager get_vars() 41016 1727204205.12547: Calling all_inventory to load vars for managed-node1 41016 1727204205.12550: Calling groups_inventory to load vars for managed-node1 41016 1727204205.12553: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204205.12562: Calling all_plugins_play to load vars for managed-node1 41016 1727204205.12565: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204205.12568: Calling groups_plugins_play to load vars for managed-node1 41016 1727204205.14969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204205.16728: done with get_vars() 41016 1727204205.16750: done getting variables 41016 1727204205.16812: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:56:45 -0400 (0:00:00.125) 0:00:28.844 ***** 41016 1727204205.16844: entering _queue_task() for managed-node1/service 41016 1727204205.17199: worker is 1 (out of 1 available) 41016 1727204205.17216: exiting _queue_task() for managed-node1/service 41016 1727204205.17230: done queuing things up, now waiting for results queue to drain 41016 1727204205.17232: waiting for pending results... 41016 1727204205.17597: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 41016 1727204205.17616: in run() - task 028d2410-947f-12d5-0ec4-00000000007a 41016 1727204205.17630: variable 'ansible_search_path' from source: unknown 41016 1727204205.17634: variable 'ansible_search_path' from source: unknown 41016 1727204205.17673: calling self._execute() 41016 1727204205.17772: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204205.17777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204205.17788: variable 'omit' from source: magic vars 41016 1727204205.18174: variable 'ansible_distribution_major_version' from source: facts 41016 1727204205.18380: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204205.18383: variable 'network_provider' from source: set_fact 41016 1727204205.18385: Evaluated conditional (network_provider == "initscripts"): False 41016 1727204205.18387: when evaluation is False, skipping this task 41016 1727204205.18388: _execute() done 41016 1727204205.18390: dumping result to json 41016 1727204205.18392: done dumping result, returning 41016 1727204205.18394: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-12d5-0ec4-00000000007a] 41016 1727204205.18396: sending task result for task 028d2410-947f-12d5-0ec4-00000000007a 41016 1727204205.18458: done sending task result for task 028d2410-947f-12d5-0ec4-00000000007a 41016 1727204205.18461: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41016 
1727204205.18506: no more pending results, returning what we have 41016 1727204205.18512: results queue empty 41016 1727204205.18513: checking for any_errors_fatal 41016 1727204205.18523: done checking for any_errors_fatal 41016 1727204205.18524: checking for max_fail_percentage 41016 1727204205.18525: done checking for max_fail_percentage 41016 1727204205.18526: checking to see if all hosts have failed and the running result is not ok 41016 1727204205.18527: done checking to see if all hosts have failed 41016 1727204205.18527: getting the remaining hosts for this loop 41016 1727204205.18528: done getting the remaining hosts for this loop 41016 1727204205.18532: getting the next task for host managed-node1 41016 1727204205.18538: done getting next task for host managed-node1 41016 1727204205.18542: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41016 1727204205.18545: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204205.18565: getting variables 41016 1727204205.18567: in VariableManager get_vars() 41016 1727204205.18606: Calling all_inventory to load vars for managed-node1 41016 1727204205.18609: Calling groups_inventory to load vars for managed-node1 41016 1727204205.18613: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204205.18623: Calling all_plugins_play to load vars for managed-node1 41016 1727204205.18626: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204205.18628: Calling groups_plugins_play to load vars for managed-node1 41016 1727204205.19922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204205.21527: done with get_vars() 41016 1727204205.21557: done getting variables 41016 1727204205.21616: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:56:45 -0400 (0:00:00.048) 0:00:28.892 ***** 41016 1727204205.21649: entering _queue_task() for managed-node1/copy 41016 1727204205.22100: worker is 1 (out of 1 available) 41016 1727204205.22112: exiting _queue_task() for managed-node1/copy 41016 1727204205.22122: done queuing things up, now waiting for results queue to drain 41016 1727204205.22124: waiting for pending results... 
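The skips above come from the same mechanism: each task's when: clause evaluated to False (__network_wpa_supplicant_required for wpa_supplicant, network_provider == "initscripts" for the network service), so both are reported as skipped rather than executed. A hypothetical sketch of the second guard; the service name network and the task wording are illustrative:

    - name: Enable network service
      ansible.builtin.service:
        name: network          # legacy initscripts service; illustrative
        state: started
        enabled: true
      when: network_provider == "initscripts"   # False on this run, so the task is skipped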
41016 1727204205.22316: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41016 1727204205.22581: in run() - task 028d2410-947f-12d5-0ec4-00000000007b 41016 1727204205.22585: variable 'ansible_search_path' from source: unknown 41016 1727204205.22588: variable 'ansible_search_path' from source: unknown 41016 1727204205.22591: calling self._execute() 41016 1727204205.22594: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204205.22596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204205.22602: variable 'omit' from source: magic vars 41016 1727204205.23002: variable 'ansible_distribution_major_version' from source: facts 41016 1727204205.23015: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204205.23136: variable 'network_provider' from source: set_fact 41016 1727204205.23140: Evaluated conditional (network_provider == "initscripts"): False 41016 1727204205.23142: when evaluation is False, skipping this task 41016 1727204205.23147: _execute() done 41016 1727204205.23150: dumping result to json 41016 1727204205.23155: done dumping result, returning 41016 1727204205.23163: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-12d5-0ec4-00000000007b] 41016 1727204205.23166: sending task result for task 028d2410-947f-12d5-0ec4-00000000007b 41016 1727204205.23266: done sending task result for task 028d2410-947f-12d5-0ec4-00000000007b 41016 1727204205.23270: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 41016 1727204205.23417: no more pending results, returning what we have 41016 1727204205.23420: results queue empty 41016 1727204205.23421: checking for any_errors_fatal 41016 1727204205.23427: done checking for any_errors_fatal 41016 1727204205.23428: checking for max_fail_percentage 41016 1727204205.23430: done checking for max_fail_percentage 41016 1727204205.23431: checking to see if all hosts have failed and the running result is not ok 41016 1727204205.23431: done checking to see if all hosts have failed 41016 1727204205.23432: getting the remaining hosts for this loop 41016 1727204205.23434: done getting the remaining hosts for this loop 41016 1727204205.23437: getting the next task for host managed-node1 41016 1727204205.23443: done getting next task for host managed-node1 41016 1727204205.23447: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41016 1727204205.23451: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204205.23470: getting variables 41016 1727204205.23471: in VariableManager get_vars() 41016 1727204205.23627: Calling all_inventory to load vars for managed-node1 41016 1727204205.23629: Calling groups_inventory to load vars for managed-node1 41016 1727204205.23632: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204205.23641: Calling all_plugins_play to load vars for managed-node1 41016 1727204205.23644: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204205.23647: Calling groups_plugins_play to load vars for managed-node1 41016 1727204205.25256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204205.26798: done with get_vars() 41016 1727204205.26821: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:56:45 -0400 (0:00:00.052) 0:00:28.944 ***** 41016 1727204205.26886: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 41016 1727204205.27152: worker is 1 (out of 1 available) 41016 1727204205.27166: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 41016 1727204205.27181: done queuing things up, now waiting for results queue to drain 41016 1727204205.27182: waiting for pending results... 41016 1727204205.27363: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41016 1727204205.27460: in run() - task 028d2410-947f-12d5-0ec4-00000000007c 41016 1727204205.27472: variable 'ansible_search_path' from source: unknown 41016 1727204205.27477: variable 'ansible_search_path' from source: unknown 41016 1727204205.27505: calling self._execute() 41016 1727204205.27582: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204205.27585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204205.27595: variable 'omit' from source: magic vars 41016 1727204205.27885: variable 'ansible_distribution_major_version' from source: facts 41016 1727204205.27894: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204205.27900: variable 'omit' from source: magic vars 41016 1727204205.27938: variable 'omit' from source: magic vars 41016 1727204205.28052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204205.29930: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204205.29977: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204205.30009: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204205.30034: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204205.30053: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204205.30113: variable 'network_provider' from source: set_fact 41016 1727204205.30206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 41016 1727204205.30242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204205.30259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204205.30287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204205.30299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204205.30355: variable 'omit' from source: magic vars 41016 1727204205.30433: variable 'omit' from source: magic vars 41016 1727204205.30506: variable 'network_connections' from source: task vars 41016 1727204205.30518: variable 'interface1' from source: play vars 41016 1727204205.30565: variable 'interface1' from source: play vars 41016 1727204205.30620: variable 'interface1_mac' from source: set_fact 41016 1727204205.30744: variable 'omit' from source: magic vars 41016 1727204205.30750: variable '__lsr_ansible_managed' from source: task vars 41016 1727204205.30797: variable '__lsr_ansible_managed' from source: task vars 41016 1727204205.30980: Loaded config def from plugin (lookup/template) 41016 1727204205.30984: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 41016 1727204205.31008: File lookup term: get_ansible_managed.j2 41016 1727204205.31012: variable 'ansible_search_path' from source: unknown 41016 1727204205.31020: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 41016 1727204205.31030: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 41016 1727204205.31043: variable 'ansible_search_path' from source: unknown 41016 1727204205.35739: variable 'ansible_managed' from source: unknown 41016 1727204205.35848: variable 'omit' from source: magic vars 41016 1727204205.35872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204205.35918: Loading Connection 'ssh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204205.35921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204205.35967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204205.35970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204205.35973: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204205.35977: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204205.35980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204205.36071: Set connection var ansible_shell_executable to /bin/sh 41016 1727204205.36074: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204205.36079: Set connection var ansible_shell_type to sh 41016 1727204205.36081: Set connection var ansible_timeout to 10 41016 1727204205.36086: Set connection var ansible_pipelining to False 41016 1727204205.36173: Set connection var ansible_connection to ssh 41016 1727204205.36179: variable 'ansible_shell_executable' from source: unknown 41016 1727204205.36182: variable 'ansible_connection' from source: unknown 41016 1727204205.36184: variable 'ansible_module_compression' from source: unknown 41016 1727204205.36186: variable 'ansible_shell_type' from source: unknown 41016 1727204205.36188: variable 'ansible_shell_executable' from source: unknown 41016 1727204205.36191: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204205.36193: variable 'ansible_pipelining' from source: unknown 41016 1727204205.36195: variable 'ansible_timeout' from source: unknown 41016 1727204205.36197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204205.36259: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204205.36481: variable 'omit' from source: magic vars 41016 1727204205.36486: starting attempt loop 41016 1727204205.36488: running the handler 41016 1727204205.36491: _low_level_execute_command(): starting 41016 1727204205.36493: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204205.36920: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204205.36940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204205.36943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204205.36957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204205.36969: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204205.36977: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204205.36987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204205.37001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204205.37011: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204205.37015: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41016 1727204205.37035: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204205.37046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204205.37052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204205.37055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204205.37057: stderr chunk (state=3): >>>debug2: match found <<< 41016 1727204205.37068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204205.37138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204205.37184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204205.37187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204205.37287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204205.39101: stdout chunk (state=3): >>>/root <<< 41016 1727204205.39248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204205.39251: stdout chunk (state=3): >>><<< 41016 1727204205.39254: stderr chunk (state=3): >>><<< 41016 1727204205.39359: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204205.39365: _low_level_execute_command(): starting 41016 1727204205.39368: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204205.3927906-43033-105767738882657 `" && echo ansible-tmp-1727204205.3927906-43033-105767738882657="` echo /root/.ansible/tmp/ansible-tmp-1727204205.3927906-43033-105767738882657 `" ) && sleep 0' 41016 1727204205.39810: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204205.39826: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204205.39831: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204205.39861: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 41016 1727204205.39865: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 41016 1727204205.39867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204205.39919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204205.39923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204205.40015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204205.42132: stdout chunk (state=3): >>>ansible-tmp-1727204205.3927906-43033-105767738882657=/root/.ansible/tmp/ansible-tmp-1727204205.3927906-43033-105767738882657 <<< 41016 1727204205.42238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204205.42270: stderr chunk (state=3): >>><<< 41016 1727204205.42272: stdout chunk (state=3): >>><<< 41016 1727204205.42289: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204205.3927906-43033-105767738882657=/root/.ansible/tmp/ansible-tmp-1727204205.3927906-43033-105767738882657 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204205.42381: variable 'ansible_module_compression' from source: unknown 41016 1727204205.42386: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 41016 1727204205.42393: variable 'ansible_facts' from source: unknown 41016 1727204205.42463: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204205.3927906-43033-105767738882657/AnsiballZ_network_connections.py 41016 1727204205.42564: Sending initial data 41016 1727204205.42567: Sent initial data (168 bytes) 41016 1727204205.43004: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204205.43007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204205.43010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204205.43012: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204205.43014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204205.43067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204205.43076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204205.43151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204205.44877: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 41016 1727204205.44882: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204205.44947: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204205.45026: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpczi8cpai /root/.ansible/tmp/ansible-tmp-1727204205.3927906-43033-105767738882657/AnsiballZ_network_connections.py <<< 41016 1727204205.45029: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204205.3927906-43033-105767738882657/AnsiballZ_network_connections.py" <<< 41016 1727204205.45107: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpczi8cpai" to remote "/root/.ansible/tmp/ansible-tmp-1727204205.3927906-43033-105767738882657/AnsiballZ_network_connections.py" <<< 41016 1727204205.45110: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204205.3927906-43033-105767738882657/AnsiballZ_network_connections.py" <<< 41016 1727204205.45949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204205.45993: stderr chunk (state=3): >>><<< 41016 1727204205.45997: stdout chunk (state=3): >>><<< 41016 1727204205.46030: done transferring module to remote 41016 1727204205.46039: _low_level_execute_command(): starting 41016 1727204205.46044: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204205.3927906-43033-105767738882657/ /root/.ansible/tmp/ansible-tmp-1727204205.3927906-43033-105767738882657/AnsiballZ_network_connections.py && sleep 0' 41016 1727204205.46464: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204205.46472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204205.46498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204205.46501: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204205.46503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204205.46561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204205.46564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204205.46568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204205.46646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204205.48588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204205.48616: stderr chunk (state=3): >>><<< 41016 1727204205.48619: stdout chunk (state=3): >>><<< 41016 1727204205.48631: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204205.48634: _low_level_execute_command(): starting 41016 1727204205.48638: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204205.3927906-43033-105767738882657/AnsiballZ_network_connections.py && sleep 0' 41016 1727204205.49070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204205.49073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204205.49082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204205.49084: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204205.49086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204205.49136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204205.49140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204205.49144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204205.49230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204205.79322: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "ca:90:ed:ea:28:3e", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", 
"ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "ca:90:ed:ea:28:3e", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 41016 1727204205.81298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204205.81324: stderr chunk (state=3): >>><<< 41016 1727204205.81327: stdout chunk (state=3): >>><<< 41016 1727204205.81343: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "ca:90:ed:ea:28:3e", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "ca:90:ed:ea:28:3e", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
41016 1727204205.81379: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest1', 'mac': 'ca:90:ed:ea:28:3e', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.4/24', '2001:db8::6/32'], 'route': [{'network': '198.58.10.64', 'prefix': 26, 'gateway': '198.51.100.102', 'metric': 4}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204205.3927906-43033-105767738882657/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204205.81387: _low_level_execute_command(): starting 41016 1727204205.81395: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204205.3927906-43033-105767738882657/ > /dev/null 2>&1 && sleep 0' 41016 1727204205.82024: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204205.82085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204205.82099: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204205.82136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204205.82152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204205.82173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204205.82305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204205.84273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204205.84294: stderr chunk (state=3): >>><<< 41016 1727204205.84297: stdout chunk (state=3): >>><<< 41016 1727204205.84312: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204205.84321: handler run complete 41016 1727204205.84357: attempt loop complete, returning result 41016 1727204205.84360: _execute() done 41016 1727204205.84362: dumping result to json 41016 1727204205.84368: done dumping result, returning 41016 1727204205.84382: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-12d5-0ec4-00000000007c] 41016 1727204205.84385: sending task result for task 028d2410-947f-12d5-0ec4-00000000007c 41016 1727204205.84491: done sending task result for task 028d2410-947f-12d5-0ec4-00000000007c 41016 1727204205.84494: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "address": [ "198.51.100.4/24", "2001:db8::6/32" ], "route": [ { "gateway": "198.51.100.102", "metric": 4, "network": "198.58.10.64", "prefix": 26 } ] }, "mac": "ca:90:ed:ea:28:3e", "name": "ethtest1", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b 41016 1727204205.84668: no more pending results, returning what we have 41016 1727204205.84672: results queue empty 41016 1727204205.84673: checking for any_errors_fatal 41016 1727204205.84680: done checking for any_errors_fatal 41016 1727204205.84681: checking for max_fail_percentage 41016 1727204205.84682: done checking for max_fail_percentage 41016 1727204205.84683: checking to see if all hosts have failed and the running result is not ok 41016 1727204205.84684: done checking to see if all hosts have failed 41016 1727204205.84684: getting the remaining hosts for this loop 41016 1727204205.84686: done getting the remaining hosts for this loop 41016 1727204205.84689: getting the next task for host managed-node1 41016 1727204205.84694: done getting next task for host managed-node1 41016 1727204205.84698: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 41016 1727204205.84700: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204205.84712: getting variables 41016 1727204205.84713: in VariableManager get_vars() 41016 1727204205.84749: Calling all_inventory to load vars for managed-node1 41016 1727204205.84751: Calling groups_inventory to load vars for managed-node1 41016 1727204205.84754: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204205.84761: Calling all_plugins_play to load vars for managed-node1 41016 1727204205.84764: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204205.84766: Calling groups_plugins_play to load vars for managed-node1 41016 1727204205.86168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204205.87084: done with get_vars() 41016 1727204205.87100: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:56:45 -0400 (0:00:00.602) 0:00:29.547 ***** 41016 1727204205.87162: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 41016 1727204205.87402: worker is 1 (out of 1 available) 41016 1727204205.87415: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 41016 1727204205.87428: done queuing things up, now waiting for results queue to drain 41016 1727204205.87429: waiting for pending results... 41016 1727204205.87614: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 41016 1727204205.87699: in run() - task 028d2410-947f-12d5-0ec4-00000000007d 41016 1727204205.87710: variable 'ansible_search_path' from source: unknown 41016 1727204205.87715: variable 'ansible_search_path' from source: unknown 41016 1727204205.87745: calling self._execute() 41016 1727204205.87827: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204205.87831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204205.87887: variable 'omit' from source: magic vars 41016 1727204205.88382: variable 'ansible_distribution_major_version' from source: facts 41016 1727204205.88385: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204205.88387: variable 'network_state' from source: role '' defaults 41016 1727204205.88394: Evaluated conditional (network_state != {}): False 41016 1727204205.88397: when evaluation is False, skipping this task 41016 1727204205.88399: _execute() done 41016 1727204205.88401: dumping result to json 41016 1727204205.88403: done dumping result, returning 41016 1727204205.88405: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-12d5-0ec4-00000000007d] 41016 1727204205.88407: sending task result for task 028d2410-947f-12d5-0ec4-00000000007d 41016 1727204205.88491: done sending task result for task 028d2410-947f-12d5-0ec4-00000000007d skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41016 1727204205.88546: no more pending results, returning what we have 41016 1727204205.88551: results queue empty 41016 1727204205.88552: checking for any_errors_fatal 41016 1727204205.88561: done checking for any_errors_fatal 41016 
1727204205.88562: checking for max_fail_percentage 41016 1727204205.88563: done checking for max_fail_percentage 41016 1727204205.88564: checking to see if all hosts have failed and the running result is not ok 41016 1727204205.88565: done checking to see if all hosts have failed 41016 1727204205.88566: getting the remaining hosts for this loop 41016 1727204205.88567: done getting the remaining hosts for this loop 41016 1727204205.88571: getting the next task for host managed-node1 41016 1727204205.88581: done getting next task for host managed-node1 41016 1727204205.88585: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41016 1727204205.88588: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204205.88612: getting variables 41016 1727204205.88614: in VariableManager get_vars() 41016 1727204205.88652: Calling all_inventory to load vars for managed-node1 41016 1727204205.88655: Calling groups_inventory to load vars for managed-node1 41016 1727204205.88657: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204205.88669: Calling all_plugins_play to load vars for managed-node1 41016 1727204205.88672: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204205.88676: Calling groups_plugins_play to load vars for managed-node1 41016 1727204205.88836: WORKER PROCESS EXITING 41016 1727204205.90070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204205.91335: done with get_vars() 41016 1727204205.91350: done getting variables 41016 1727204205.91395: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:56:45 -0400 (0:00:00.042) 0:00:29.590 ***** 41016 1727204205.91422: entering _queue_task() for managed-node1/debug 41016 1727204205.91653: worker is 1 (out of 1 available) 41016 1727204205.91665: exiting _queue_task() for managed-node1/debug 41016 1727204205.91678: done queuing things up, now waiting for results queue to drain 41016 1727204205.91680: waiting for pending results... 
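[Editor's note] The worker queued here runs the role's "Show stderr messages for the network_connections" task (main.yml:177), a debug action over the __network_connections_result fact set after the module run above. A minimal sketch of such a task, assuming the plain shape implied by the ok: output further below (the role's real task may add guards or verbosity settings):

    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines

Its output below prints the single provider message recorded while the ethtest1 profile was updated.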
41016 1727204205.91859: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41016 1727204205.91952: in run() - task 028d2410-947f-12d5-0ec4-00000000007e 41016 1727204205.91964: variable 'ansible_search_path' from source: unknown 41016 1727204205.91968: variable 'ansible_search_path' from source: unknown 41016 1727204205.91996: calling self._execute() 41016 1727204205.92065: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204205.92069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204205.92078: variable 'omit' from source: magic vars 41016 1727204205.92361: variable 'ansible_distribution_major_version' from source: facts 41016 1727204205.92370: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204205.92377: variable 'omit' from source: magic vars 41016 1727204205.92460: variable 'omit' from source: magic vars 41016 1727204205.92463: variable 'omit' from source: magic vars 41016 1727204205.92470: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204205.92498: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204205.92515: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204205.92527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204205.92536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204205.92563: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204205.92566: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204205.92568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204205.92635: Set connection var ansible_shell_executable to /bin/sh 41016 1727204205.92638: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204205.92644: Set connection var ansible_shell_type to sh 41016 1727204205.92649: Set connection var ansible_timeout to 10 41016 1727204205.92655: Set connection var ansible_pipelining to False 41016 1727204205.92661: Set connection var ansible_connection to ssh 41016 1727204205.92679: variable 'ansible_shell_executable' from source: unknown 41016 1727204205.92682: variable 'ansible_connection' from source: unknown 41016 1727204205.92684: variable 'ansible_module_compression' from source: unknown 41016 1727204205.92687: variable 'ansible_shell_type' from source: unknown 41016 1727204205.92689: variable 'ansible_shell_executable' from source: unknown 41016 1727204205.92691: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204205.92699: variable 'ansible_pipelining' from source: unknown 41016 1727204205.92702: variable 'ansible_timeout' from source: unknown 41016 1727204205.92704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204205.92988: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 
1727204205.92992: variable 'omit' from source: magic vars 41016 1727204205.92994: starting attempt loop 41016 1727204205.92997: running the handler 41016 1727204205.92999: variable '__network_connections_result' from source: set_fact 41016 1727204205.93045: handler run complete 41016 1727204205.93072: attempt loop complete, returning result 41016 1727204205.93083: _execute() done 41016 1727204205.93091: dumping result to json 41016 1727204205.93099: done dumping result, returning 41016 1727204205.93118: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-12d5-0ec4-00000000007e] 41016 1727204205.93135: sending task result for task 028d2410-947f-12d5-0ec4-00000000007e 41016 1727204205.93248: done sending task result for task 028d2410-947f-12d5-0ec4-00000000007e 41016 1727204205.93257: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b" ] } 41016 1727204205.93332: no more pending results, returning what we have 41016 1727204205.93336: results queue empty 41016 1727204205.93337: checking for any_errors_fatal 41016 1727204205.93347: done checking for any_errors_fatal 41016 1727204205.93348: checking for max_fail_percentage 41016 1727204205.93349: done checking for max_fail_percentage 41016 1727204205.93350: checking to see if all hosts have failed and the running result is not ok 41016 1727204205.93351: done checking to see if all hosts have failed 41016 1727204205.93352: getting the remaining hosts for this loop 41016 1727204205.93353: done getting the remaining hosts for this loop 41016 1727204205.93357: getting the next task for host managed-node1 41016 1727204205.93364: done getting next task for host managed-node1 41016 1727204205.93368: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41016 1727204205.93371: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204205.93384: getting variables 41016 1727204205.93386: in VariableManager get_vars() 41016 1727204205.93426: Calling all_inventory to load vars for managed-node1 41016 1727204205.93429: Calling groups_inventory to load vars for managed-node1 41016 1727204205.93431: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204205.93441: Calling all_plugins_play to load vars for managed-node1 41016 1727204205.93444: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204205.93447: Calling groups_plugins_play to load vars for managed-node1 41016 1727204205.94523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204205.96183: done with get_vars() 41016 1727204205.96203: done getting variables 41016 1727204205.96263: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:56:45 -0400 (0:00:00.048) 0:00:29.639 ***** 41016 1727204205.96296: entering _queue_task() for managed-node1/debug 41016 1727204205.96613: worker is 1 (out of 1 available) 41016 1727204205.96627: exiting _queue_task() for managed-node1/debug 41016 1727204205.96640: done queuing things up, now waiting for results queue to drain 41016 1727204205.96641: waiting for pending results... 41016 1727204205.96949: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41016 1727204205.97282: in run() - task 028d2410-947f-12d5-0ec4-00000000007f 41016 1727204205.97286: variable 'ansible_search_path' from source: unknown 41016 1727204205.97289: variable 'ansible_search_path' from source: unknown 41016 1727204205.97292: calling self._execute() 41016 1727204205.97295: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204205.97298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204205.97301: variable 'omit' from source: magic vars 41016 1727204205.97714: variable 'ansible_distribution_major_version' from source: facts 41016 1727204205.97731: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204205.97748: variable 'omit' from source: magic vars 41016 1727204205.97808: variable 'omit' from source: magic vars 41016 1727204205.97856: variable 'omit' from source: magic vars 41016 1727204205.97904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204205.97948: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204205.97980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204205.98004: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204205.98024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204205.98059: variable 
'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204205.98074: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204205.98095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204205.98212: Set connection var ansible_shell_executable to /bin/sh 41016 1727204205.98227: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204205.98239: Set connection var ansible_shell_type to sh 41016 1727204205.98280: Set connection var ansible_timeout to 10 41016 1727204205.98285: Set connection var ansible_pipelining to False 41016 1727204205.98292: Set connection var ansible_connection to ssh 41016 1727204205.98307: variable 'ansible_shell_executable' from source: unknown 41016 1727204205.98320: variable 'ansible_connection' from source: unknown 41016 1727204205.98330: variable 'ansible_module_compression' from source: unknown 41016 1727204205.98401: variable 'ansible_shell_type' from source: unknown 41016 1727204205.98405: variable 'ansible_shell_executable' from source: unknown 41016 1727204205.98407: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204205.98412: variable 'ansible_pipelining' from source: unknown 41016 1727204205.98414: variable 'ansible_timeout' from source: unknown 41016 1727204205.98416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204205.98528: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204205.98547: variable 'omit' from source: magic vars 41016 1727204205.98560: starting attempt loop 41016 1727204205.98569: running the handler 41016 1727204205.98632: variable '__network_connections_result' from source: set_fact 41016 1727204205.98727: variable '__network_connections_result' from source: set_fact 41016 1727204205.98873: handler run complete 41016 1727204205.98944: attempt loop complete, returning result 41016 1727204205.98947: _execute() done 41016 1727204205.98949: dumping result to json 41016 1727204205.98951: done dumping result, returning 41016 1727204205.98954: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-12d5-0ec4-00000000007f] 41016 1727204205.98960: sending task result for task 028d2410-947f-12d5-0ec4-00000000007f 41016 1727204205.99204: done sending task result for task 028d2410-947f-12d5-0ec4-00000000007f 41016 1727204205.99208: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "address": [ "198.51.100.4/24", "2001:db8::6/32" ], "route": [ { "gateway": "198.51.100.102", "metric": 4, "network": "198.58.10.64", "prefix": 26 } ] }, "mac": "ca:90:ed:ea:28:3e", "name": "ethtest1", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b\n", "stderr_lines": [ "[002] #0, state:None persistent_state:present, 'ethtest1': update 
connection ethtest1, af2476db-1e3b-4f5e-ab84-23db91da8d4b" ] } } 41016 1727204205.99301: no more pending results, returning what we have 41016 1727204205.99305: results queue empty 41016 1727204205.99306: checking for any_errors_fatal 41016 1727204205.99314: done checking for any_errors_fatal 41016 1727204205.99314: checking for max_fail_percentage 41016 1727204205.99316: done checking for max_fail_percentage 41016 1727204205.99316: checking to see if all hosts have failed and the running result is not ok 41016 1727204205.99317: done checking to see if all hosts have failed 41016 1727204205.99318: getting the remaining hosts for this loop 41016 1727204205.99319: done getting the remaining hosts for this loop 41016 1727204205.99322: getting the next task for host managed-node1 41016 1727204205.99328: done getting next task for host managed-node1 41016 1727204205.99331: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41016 1727204205.99333: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204205.99342: getting variables 41016 1727204205.99344: in VariableManager get_vars() 41016 1727204205.99378: Calling all_inventory to load vars for managed-node1 41016 1727204205.99386: Calling groups_inventory to load vars for managed-node1 41016 1727204205.99388: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204205.99396: Calling all_plugins_play to load vars for managed-node1 41016 1727204205.99399: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204205.99401: Calling groups_plugins_play to load vars for managed-node1 41016 1727204206.00149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204206.01019: done with get_vars() 41016 1727204206.01034: done getting variables 41016 1727204206.01077: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:56:46 -0400 (0:00:00.048) 0:00:29.687 ***** 41016 1727204206.01100: entering _queue_task() for managed-node1/debug 41016 1727204206.01328: worker is 1 (out of 1 available) 41016 1727204206.01341: exiting _queue_task() for managed-node1/debug 41016 1727204206.01352: done queuing things up, now waiting for results queue to drain 41016 1727204206.01354: waiting for pending results... 
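[Editor's note] In the remainder of this excerpt the role does two more things: the "Show debug messages for the network_state" task (main.yml:186) is skipped because network_state is still the role default {}, exactly like the earlier "Configure networking state" task, and the role then re-tests connectivity with the ping module (main.yml:192). A hedged sketch of those two steps follows; only the task names, the when-condition, and the module choice are taken from the log, and the debug body is a placeholder:

    - name: Show debug messages for the network_state
      debug:
        var: network_state            # placeholder body, not confirmed by this log
      when: network_state != {}       # role default is {}, so this evaluates False and the task is skipped

    - name: Re-test connectivity
      ping:                           # the log shows AnsiballZ_ping.py being transferred and executed next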
41016 1727204206.01538: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41016 1727204206.01634: in run() - task 028d2410-947f-12d5-0ec4-000000000080 41016 1727204206.01645: variable 'ansible_search_path' from source: unknown 41016 1727204206.01648: variable 'ansible_search_path' from source: unknown 41016 1727204206.01675: calling self._execute() 41016 1727204206.01748: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204206.01752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204206.01760: variable 'omit' from source: magic vars 41016 1727204206.02038: variable 'ansible_distribution_major_version' from source: facts 41016 1727204206.02047: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204206.02131: variable 'network_state' from source: role '' defaults 41016 1727204206.02140: Evaluated conditional (network_state != {}): False 41016 1727204206.02143: when evaluation is False, skipping this task 41016 1727204206.02146: _execute() done 41016 1727204206.02149: dumping result to json 41016 1727204206.02153: done dumping result, returning 41016 1727204206.02160: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-12d5-0ec4-000000000080] 41016 1727204206.02165: sending task result for task 028d2410-947f-12d5-0ec4-000000000080 41016 1727204206.02243: done sending task result for task 028d2410-947f-12d5-0ec4-000000000080 41016 1727204206.02246: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "network_state != {}" } 41016 1727204206.02293: no more pending results, returning what we have 41016 1727204206.02297: results queue empty 41016 1727204206.02298: checking for any_errors_fatal 41016 1727204206.02311: done checking for any_errors_fatal 41016 1727204206.02312: checking for max_fail_percentage 41016 1727204206.02314: done checking for max_fail_percentage 41016 1727204206.02315: checking to see if all hosts have failed and the running result is not ok 41016 1727204206.02315: done checking to see if all hosts have failed 41016 1727204206.02316: getting the remaining hosts for this loop 41016 1727204206.02317: done getting the remaining hosts for this loop 41016 1727204206.02321: getting the next task for host managed-node1 41016 1727204206.02327: done getting next task for host managed-node1 41016 1727204206.02330: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 41016 1727204206.02333: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204206.02350: getting variables 41016 1727204206.02352: in VariableManager get_vars() 41016 1727204206.02387: Calling all_inventory to load vars for managed-node1 41016 1727204206.02390: Calling groups_inventory to load vars for managed-node1 41016 1727204206.02392: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204206.02400: Calling all_plugins_play to load vars for managed-node1 41016 1727204206.02402: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204206.02404: Calling groups_plugins_play to load vars for managed-node1 41016 1727204206.03250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204206.04098: done with get_vars() 41016 1727204206.04113: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:56:46 -0400 (0:00:00.030) 0:00:29.717 ***** 41016 1727204206.04177: entering _queue_task() for managed-node1/ping 41016 1727204206.04388: worker is 1 (out of 1 available) 41016 1727204206.04401: exiting _queue_task() for managed-node1/ping 41016 1727204206.04411: done queuing things up, now waiting for results queue to drain 41016 1727204206.04413: waiting for pending results... 41016 1727204206.04589: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 41016 1727204206.04680: in run() - task 028d2410-947f-12d5-0ec4-000000000081 41016 1727204206.04693: variable 'ansible_search_path' from source: unknown 41016 1727204206.04696: variable 'ansible_search_path' from source: unknown 41016 1727204206.04727: calling self._execute() 41016 1727204206.04793: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204206.04796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204206.04804: variable 'omit' from source: magic vars 41016 1727204206.05073: variable 'ansible_distribution_major_version' from source: facts 41016 1727204206.05086: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204206.05089: variable 'omit' from source: magic vars 41016 1727204206.05127: variable 'omit' from source: magic vars 41016 1727204206.05152: variable 'omit' from source: magic vars 41016 1727204206.05186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204206.05213: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204206.05229: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204206.05242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204206.05253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204206.05274: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204206.05279: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204206.05282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204206.05351: Set connection var ansible_shell_executable to /bin/sh 41016 1727204206.05355: Set connection var 
ansible_module_compression to ZIP_DEFLATED 41016 1727204206.05360: Set connection var ansible_shell_type to sh 41016 1727204206.05365: Set connection var ansible_timeout to 10 41016 1727204206.05371: Set connection var ansible_pipelining to False 41016 1727204206.05378: Set connection var ansible_connection to ssh 41016 1727204206.05394: variable 'ansible_shell_executable' from source: unknown 41016 1727204206.05397: variable 'ansible_connection' from source: unknown 41016 1727204206.05399: variable 'ansible_module_compression' from source: unknown 41016 1727204206.05402: variable 'ansible_shell_type' from source: unknown 41016 1727204206.05404: variable 'ansible_shell_executable' from source: unknown 41016 1727204206.05408: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204206.05410: variable 'ansible_pipelining' from source: unknown 41016 1727204206.05420: variable 'ansible_timeout' from source: unknown 41016 1727204206.05422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204206.05561: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204206.05570: variable 'omit' from source: magic vars 41016 1727204206.05577: starting attempt loop 41016 1727204206.05580: running the handler 41016 1727204206.05592: _low_level_execute_command(): starting 41016 1727204206.05599: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204206.06105: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204206.06112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41016 1727204206.06116: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204206.06163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204206.06166: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204206.06168: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204206.06258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204206.08040: stdout chunk (state=3): >>>/root <<< 41016 1727204206.08143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204206.08170: stderr chunk (state=3): >>><<< 41016 1727204206.08173: stdout chunk (state=3): >>><<< 41016 1727204206.08194: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204206.08205: _low_level_execute_command(): starting 41016 1727204206.08210: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204206.081933-43066-188793440215299 `" && echo ansible-tmp-1727204206.081933-43066-188793440215299="` echo /root/.ansible/tmp/ansible-tmp-1727204206.081933-43066-188793440215299 `" ) && sleep 0' 41016 1727204206.08636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204206.08639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204206.08642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204206.08650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204206.08696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204206.08699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204206.08787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204206.10921: stdout chunk (state=3): >>>ansible-tmp-1727204206.081933-43066-188793440215299=/root/.ansible/tmp/ansible-tmp-1727204206.081933-43066-188793440215299 <<< 41016 1727204206.11026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204206.11048: stderr chunk (state=3): >>><<< 41016 1727204206.11051: stdout chunk (state=3): >>><<< 41016 1727204206.11064: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204206.081933-43066-188793440215299=/root/.ansible/tmp/ansible-tmp-1727204206.081933-43066-188793440215299 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204206.11107: variable 'ansible_module_compression' from source: unknown 41016 1727204206.11139: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 41016 1727204206.11168: variable 'ansible_facts' from source: unknown 41016 1727204206.11226: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204206.081933-43066-188793440215299/AnsiballZ_ping.py 41016 1727204206.11322: Sending initial data 41016 1727204206.11325: Sent initial data (152 bytes) 41016 1727204206.11899: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204206.11922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204206.12031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204206.13794: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" 
revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204206.13898: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204206.13995: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpnigv0arh /root/.ansible/tmp/ansible-tmp-1727204206.081933-43066-188793440215299/AnsiballZ_ping.py <<< 41016 1727204206.13998: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204206.081933-43066-188793440215299/AnsiballZ_ping.py" <<< 41016 1727204206.14057: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpnigv0arh" to remote "/root/.ansible/tmp/ansible-tmp-1727204206.081933-43066-188793440215299/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204206.081933-43066-188793440215299/AnsiballZ_ping.py" <<< 41016 1727204206.15042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204206.15045: stdout chunk (state=3): >>><<< 41016 1727204206.15048: stderr chunk (state=3): >>><<< 41016 1727204206.15050: done transferring module to remote 41016 1727204206.15052: _low_level_execute_command(): starting 41016 1727204206.15054: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204206.081933-43066-188793440215299/ /root/.ansible/tmp/ansible-tmp-1727204206.081933-43066-188793440215299/AnsiballZ_ping.py && sleep 0' 41016 1727204206.15597: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204206.15691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204206.15736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204206.15751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204206.15769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204206.15880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204206.17859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204206.17873: stdout chunk (state=3): >>><<< 41016 1727204206.17890: stderr chunk (state=3): 
>>><<< 41016 1727204206.17995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204206.17998: _low_level_execute_command(): starting 41016 1727204206.18001: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204206.081933-43066-188793440215299/AnsiballZ_ping.py && sleep 0' 41016 1727204206.18555: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204206.18568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204206.18583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204206.18600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204206.18624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204206.18635: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204206.18647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204206.18741: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204206.18757: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204206.18771: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204206.18793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204206.18906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204206.35165: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 41016 1727204206.36767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204206.37133: stdout chunk (state=3): >>><<< 41016 1727204206.37137: stderr chunk (state=3): >>><<< 41016 1727204206.37139: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 41016 1727204206.37142: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204206.081933-43066-188793440215299/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204206.37144: _low_level_execute_command(): starting 41016 1727204206.37146: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204206.081933-43066-188793440215299/ > /dev/null 2>&1 && sleep 0' 41016 1727204206.38192: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204206.38205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41016 1727204206.38225: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204206.38271: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204206.38555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204206.38593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204206.40570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204206.40603: stderr chunk (state=3): >>><<< 41016 1727204206.40672: stdout chunk (state=3): >>><<< 41016 1727204206.40694: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204206.40717: handler run complete 41016 1727204206.40807: attempt loop complete, returning result 41016 1727204206.40833: _execute() done 41016 1727204206.40841: dumping result to json 41016 1727204206.40850: done dumping result, returning 41016 1727204206.41184: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-12d5-0ec4-000000000081] 41016 1727204206.41187: sending task result for task 028d2410-947f-12d5-0ec4-000000000081 41016 1727204206.41259: done sending task result for task 028d2410-947f-12d5-0ec4-000000000081 41016 1727204206.41262: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 41016 1727204206.41354: no more pending results, returning what we have 41016 1727204206.41358: results queue empty 41016 1727204206.41360: checking for any_errors_fatal 41016 1727204206.41369: done checking for any_errors_fatal 41016 1727204206.41370: checking for max_fail_percentage 41016 1727204206.41372: done checking for max_fail_percentage 41016 1727204206.41373: checking to see if all hosts have failed and the running result is not ok 41016 1727204206.41374: done checking to see if all hosts have failed 41016 1727204206.41377: getting the remaining hosts for this loop 41016 1727204206.41379: done getting the remaining hosts for this loop 41016 1727204206.41383: getting the next task for host managed-node1 41016 1727204206.41394: done getting next task for host managed-node1 41016 1727204206.41397: ^ task is: TASK: meta (role_complete) 41016 1727204206.41400: ^ state is: HOST STATE: block=3, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204206.41415: getting variables 41016 1727204206.41417: in VariableManager get_vars() 41016 1727204206.41462: Calling all_inventory to load vars for managed-node1 41016 1727204206.41465: Calling groups_inventory to load vars for managed-node1 41016 1727204206.41468: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204206.41990: Calling all_plugins_play to load vars for managed-node1 41016 1727204206.41995: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204206.41999: Calling groups_plugins_play to load vars for managed-node1 41016 1727204206.44600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204206.47826: done with get_vars() 41016 1727204206.47854: done getting variables 41016 1727204206.48143: done queuing things up, now waiting for results queue to drain 41016 1727204206.48145: results queue empty 41016 1727204206.48146: checking for any_errors_fatal 41016 1727204206.48149: done checking for any_errors_fatal 41016 1727204206.48150: checking for max_fail_percentage 41016 1727204206.48151: done checking for max_fail_percentage 41016 1727204206.48152: checking to see if all hosts have failed and the running result is not ok 41016 1727204206.48153: done checking to see if all hosts have failed 41016 1727204206.48154: getting the remaining hosts for this loop 41016 1727204206.48155: done getting the remaining hosts for this loop 41016 1727204206.48157: getting the next task for host managed-node1 41016 1727204206.48162: done getting next task for host managed-node1 41016 1727204206.48164: ^ task is: TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider 41016 1727204206.48166: ^ state is: HOST STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204206.48168: getting variables 41016 1727204206.48169: in VariableManager get_vars() 41016 1727204206.48186: Calling all_inventory to load vars for managed-node1 41016 1727204206.48189: Calling groups_inventory to load vars for managed-node1 41016 1727204206.48191: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204206.48196: Calling all_plugins_play to load vars for managed-node1 41016 1727204206.48198: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204206.48201: Calling groups_plugins_play to load vars for managed-node1 41016 1727204206.49640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204206.51857: done with get_vars() 41016 1727204206.51883: done getting variables 41016 1727204206.51931: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the warning about specifying the route without the output device is logged for initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:122 Tuesday 24 September 2024 14:56:46 -0400 (0:00:00.477) 0:00:30.195 ***** 41016 1727204206.51958: entering _queue_task() for managed-node1/assert 41016 1727204206.52503: worker is 1 (out of 1 available) 41016 1727204206.52514: exiting _queue_task() for managed-node1/assert 41016 1727204206.52526: done queuing things up, now waiting for results queue to drain 41016 1727204206.52527: waiting for pending results... 
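The trace above is one complete module round-trip for the role's connectivity re-test: the ssh connection plugin runs 'echo ~' to resolve the remote home, creates a per-task directory under /root/.ansible/tmp, pushes the cached AnsiballZ_ping.py over SFTP, marks it executable, runs it with /usr/bin/python3.12 (which prints {"ping": "pong"}), and finally removes the temp directory. At the playbook level that whole exchange corresponds to a single ping task; the sketch below is an illustration that assumes nothing beyond the task name and module visible in the log.

    - name: Re-test connectivity
      ansible.builtin.ping:

Running such a task with ANSIBLE_DEBUG enabled and verbosity 2 is what yields chunked stderr/stdout output of the shape seen in this log.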
41016 1727204206.52743: running TaskExecutor() for managed-node1/TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider 41016 1727204206.52868: in run() - task 028d2410-947f-12d5-0ec4-0000000000b1 41016 1727204206.52872: variable 'ansible_search_path' from source: unknown 41016 1727204206.52900: calling self._execute() 41016 1727204206.53084: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204206.53088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204206.53091: variable 'omit' from source: magic vars 41016 1727204206.53456: variable 'ansible_distribution_major_version' from source: facts 41016 1727204206.53473: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204206.53601: variable 'network_provider' from source: set_fact 41016 1727204206.53614: Evaluated conditional (network_provider == "initscripts"): False 41016 1727204206.53637: when evaluation is False, skipping this task 41016 1727204206.53647: _execute() done 41016 1727204206.53655: dumping result to json 41016 1727204206.53663: done dumping result, returning 41016 1727204206.53673: done running TaskExecutor() for managed-node1/TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider [028d2410-947f-12d5-0ec4-0000000000b1] 41016 1727204206.53782: sending task result for task 028d2410-947f-12d5-0ec4-0000000000b1 skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 41016 1727204206.54022: no more pending results, returning what we have 41016 1727204206.54026: results queue empty 41016 1727204206.54027: checking for any_errors_fatal 41016 1727204206.54029: done checking for any_errors_fatal 41016 1727204206.54030: checking for max_fail_percentage 41016 1727204206.54032: done checking for max_fail_percentage 41016 1727204206.54033: checking to see if all hosts have failed and the running result is not ok 41016 1727204206.54034: done checking to see if all hosts have failed 41016 1727204206.54034: getting the remaining hosts for this loop 41016 1727204206.54036: done getting the remaining hosts for this loop 41016 1727204206.54039: getting the next task for host managed-node1 41016 1727204206.54047: done getting next task for host managed-node1 41016 1727204206.54049: ^ task is: TASK: Assert that no warning is logged for nm provider 41016 1727204206.54052: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204206.54055: getting variables 41016 1727204206.54057: in VariableManager get_vars() 41016 1727204206.54112: Calling all_inventory to load vars for managed-node1 41016 1727204206.54116: Calling groups_inventory to load vars for managed-node1 41016 1727204206.54119: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204206.54208: Calling all_plugins_play to load vars for managed-node1 41016 1727204206.54215: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204206.54220: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000b1 41016 1727204206.54223: WORKER PROCESS EXITING 41016 1727204206.54227: Calling groups_plugins_play to load vars for managed-node1 41016 1727204206.56432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204206.58125: done with get_vars() 41016 1727204206.58151: done getting variables 41016 1727204206.58215: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that no warning is logged for nm provider] ************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:129 Tuesday 24 September 2024 14:56:46 -0400 (0:00:00.062) 0:00:30.258 ***** 41016 1727204206.58250: entering _queue_task() for managed-node1/assert 41016 1727204206.58697: worker is 1 (out of 1 available) 41016 1727204206.58813: exiting _queue_task() for managed-node1/assert 41016 1727204206.58829: done queuing things up, now waiting for results queue to drain 41016 1727204206.58831: waiting for pending results... 
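Both assertion tasks queued above are gated twice: the distribution check (ansible_distribution_major_version != '6') passes, and then the provider check decides which variant actually runs; with network_provider set to "nm" via set_fact, the initscripts assertion is skipped and the nm assertion proceeds. A hedged sketch of that gating follows; the assertion expression itself is a placeholder, since the real test lives in tests_route_device.yml:122 and its contents are not reproduced in this log.

    - name: Assert that the warning about specifying the route without the output device is logged for initscripts provider
      ansible.builtin.assert:
        that:
          # placeholder pattern for illustration; the expected warning text is defined in the test playbook, not in this log
          - __network_connections_result.stderr is search("route")
      when:
        - ansible_distribution_major_version != '6'   # may be inherited from an enclosing block in the real playbook
        - network_provider == "initscripts"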
41016 1727204206.59016: running TaskExecutor() for managed-node1/TASK: Assert that no warning is logged for nm provider 41016 1727204206.59115: in run() - task 028d2410-947f-12d5-0ec4-0000000000b2 41016 1727204206.59168: variable 'ansible_search_path' from source: unknown 41016 1727204206.59185: calling self._execute() 41016 1727204206.59313: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204206.59385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204206.59389: variable 'omit' from source: magic vars 41016 1727204206.59745: variable 'ansible_distribution_major_version' from source: facts 41016 1727204206.59762: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204206.59890: variable 'network_provider' from source: set_fact 41016 1727204206.59902: Evaluated conditional (network_provider == "nm"): True 41016 1727204206.59917: variable 'omit' from source: magic vars 41016 1727204206.59949: variable 'omit' from source: magic vars 41016 1727204206.59992: variable 'omit' from source: magic vars 41016 1727204206.60045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204206.60086: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204206.60146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204206.60150: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204206.60156: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204206.60191: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204206.60199: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204206.60207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204206.60323: Set connection var ansible_shell_executable to /bin/sh 41016 1727204206.60363: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204206.60366: Set connection var ansible_shell_type to sh 41016 1727204206.60368: Set connection var ansible_timeout to 10 41016 1727204206.60370: Set connection var ansible_pipelining to False 41016 1727204206.60377: Set connection var ansible_connection to ssh 41016 1727204206.60403: variable 'ansible_shell_executable' from source: unknown 41016 1727204206.60415: variable 'ansible_connection' from source: unknown 41016 1727204206.60474: variable 'ansible_module_compression' from source: unknown 41016 1727204206.60480: variable 'ansible_shell_type' from source: unknown 41016 1727204206.60482: variable 'ansible_shell_executable' from source: unknown 41016 1727204206.60484: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204206.60486: variable 'ansible_pipelining' from source: unknown 41016 1727204206.60488: variable 'ansible_timeout' from source: unknown 41016 1727204206.60490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204206.60599: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 41016 1727204206.60621: variable 'omit' from source: magic vars 41016 1727204206.60634: starting attempt loop 41016 1727204206.60641: running the handler 41016 1727204206.60821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204206.61133: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204206.61281: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204206.61542: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204206.61580: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204206.61673: variable '__network_connections_result' from source: set_fact 41016 1727204206.61698: Evaluated conditional (__network_connections_result.stderr is not search("")): True 41016 1727204206.61704: handler run complete 41016 1727204206.61718: attempt loop complete, returning result 41016 1727204206.61728: _execute() done 41016 1727204206.61731: dumping result to json 41016 1727204206.61733: done dumping result, returning 41016 1727204206.61738: done running TaskExecutor() for managed-node1/TASK: Assert that no warning is logged for nm provider [028d2410-947f-12d5-0ec4-0000000000b2] 41016 1727204206.61780: sending task result for task 028d2410-947f-12d5-0ec4-0000000000b2 41016 1727204206.62097: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000b2 41016 1727204206.62101: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 41016 1727204206.62144: no more pending results, returning what we have 41016 1727204206.62147: results queue empty 41016 1727204206.62148: checking for any_errors_fatal 41016 1727204206.62154: done checking for any_errors_fatal 41016 1727204206.62155: checking for max_fail_percentage 41016 1727204206.62157: done checking for max_fail_percentage 41016 1727204206.62158: checking to see if all hosts have failed and the running result is not ok 41016 1727204206.62159: done checking to see if all hosts have failed 41016 1727204206.62160: getting the remaining hosts for this loop 41016 1727204206.62161: done getting the remaining hosts for this loop 41016 1727204206.62165: getting the next task for host managed-node1 41016 1727204206.62173: done getting next task for host managed-node1 41016 1727204206.62177: ^ task is: TASK: Bring down test devices and profiles 41016 1727204206.62181: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204206.62185: getting variables 41016 1727204206.62187: in VariableManager get_vars() 41016 1727204206.62234: Calling all_inventory to load vars for managed-node1 41016 1727204206.62237: Calling groups_inventory to load vars for managed-node1 41016 1727204206.62240: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204206.62249: Calling all_plugins_play to load vars for managed-node1 41016 1727204206.62253: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204206.62256: Calling groups_plugins_play to load vars for managed-node1 41016 1727204206.68967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204206.70572: done with get_vars() 41016 1727204206.70598: done getting variables TASK [Bring down test devices and profiles] ************************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:140 Tuesday 24 September 2024 14:56:46 -0400 (0:00:00.124) 0:00:30.382 ***** 41016 1727204206.70695: entering _queue_task() for managed-node1/include_role 41016 1727204206.70697: Creating lock for include_role 41016 1727204206.71307: worker is 1 (out of 1 available) 41016 1727204206.71319: exiting _queue_task() for managed-node1/include_role 41016 1727204206.71329: done queuing things up, now waiting for results queue to drain 41016 1727204206.71331: waiting for pending results... 41016 1727204206.71453: running TaskExecutor() for managed-node1/TASK: Bring down test devices and profiles 41016 1727204206.71613: in run() - task 028d2410-947f-12d5-0ec4-0000000000b4 41016 1727204206.71637: variable 'ansible_search_path' from source: unknown 41016 1727204206.71689: calling self._execute() 41016 1727204206.71797: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204206.71885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204206.71889: variable 'omit' from source: magic vars 41016 1727204206.72281: variable 'ansible_distribution_major_version' from source: facts 41016 1727204206.72320: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204206.72330: _execute() done 41016 1727204206.72334: dumping result to json 41016 1727204206.72380: done dumping result, returning 41016 1727204206.72384: done running TaskExecutor() for managed-node1/TASK: Bring down test devices and profiles [028d2410-947f-12d5-0ec4-0000000000b4] 41016 1727204206.72387: sending task result for task 028d2410-947f-12d5-0ec4-0000000000b4 41016 1727204206.72645: no more pending results, returning what we have 41016 1727204206.72652: in VariableManager get_vars() 41016 1727204206.72705: Calling all_inventory to load vars for managed-node1 41016 1727204206.72708: Calling groups_inventory to load vars for managed-node1 41016 1727204206.72713: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204206.72725: Calling all_plugins_play to load vars for managed-node1 41016 1727204206.72728: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204206.72732: Calling groups_plugins_play to load vars for managed-node1 41016 1727204206.73359: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000b4 41016 1727204206.73363: WORKER PROCESS EXITING 41016 1727204206.74342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 
1727204206.76087: done with get_vars() 41016 1727204206.76104: variable 'ansible_search_path' from source: unknown 41016 1727204206.76397: variable 'omit' from source: magic vars 41016 1727204206.76433: variable 'omit' from source: magic vars 41016 1727204206.76450: variable 'omit' from source: magic vars 41016 1727204206.76453: we have included files to process 41016 1727204206.76454: generating all_blocks data 41016 1727204206.76458: done generating all_blocks data 41016 1727204206.76467: processing included file: fedora.linux_system_roles.network 41016 1727204206.76491: in VariableManager get_vars() 41016 1727204206.76509: done with get_vars() 41016 1727204206.76540: in VariableManager get_vars() 41016 1727204206.76560: done with get_vars() 41016 1727204206.76614: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 41016 1727204206.76739: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 41016 1727204206.76826: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 41016 1727204206.77308: in VariableManager get_vars() 41016 1727204206.77339: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 41016 1727204206.79369: iterating over new_blocks loaded from include file 41016 1727204206.79372: in VariableManager get_vars() 41016 1727204206.79396: done with get_vars() 41016 1727204206.79398: filtering new block on tags 41016 1727204206.79653: done filtering new block on tags 41016 1727204206.79657: in VariableManager get_vars() 41016 1727204206.79674: done with get_vars() 41016 1727204206.79677: filtering new block on tags 41016 1727204206.79694: done filtering new block on tags 41016 1727204206.79696: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed-node1 41016 1727204206.79701: extending task lists for all hosts with included blocks 41016 1727204206.79927: done extending task lists 41016 1727204206.79929: done processing included files 41016 1727204206.79930: results queue empty 41016 1727204206.79930: checking for any_errors_fatal 41016 1727204206.79937: done checking for any_errors_fatal 41016 1727204206.79940: checking for max_fail_percentage 41016 1727204206.79941: done checking for max_fail_percentage 41016 1727204206.79942: checking to see if all hosts have failed and the running result is not ok 41016 1727204206.79943: done checking to see if all hosts have failed 41016 1727204206.79944: getting the remaining hosts for this loop 41016 1727204206.79945: done getting the remaining hosts for this loop 41016 1727204206.79948: getting the next task for host managed-node1 41016 1727204206.79952: done getting next task for host managed-node1 41016 1727204206.79955: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41016 1727204206.79957: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204206.79967: getting variables 41016 1727204206.79968: in VariableManager get_vars() 41016 1727204206.79985: Calling all_inventory to load vars for managed-node1 41016 1727204206.79987: Calling groups_inventory to load vars for managed-node1 41016 1727204206.79989: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204206.79995: Calling all_plugins_play to load vars for managed-node1 41016 1727204206.79997: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204206.80000: Calling groups_plugins_play to load vars for managed-node1 41016 1727204206.81224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204206.82897: done with get_vars() 41016 1727204206.82926: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:56:46 -0400 (0:00:00.123) 0:00:30.506 ***** 41016 1727204206.83005: entering _queue_task() for managed-node1/include_tasks 41016 1727204206.83384: worker is 1 (out of 1 available) 41016 1727204206.83396: exiting _queue_task() for managed-node1/include_tasks 41016 1727204206.83521: done queuing things up, now waiting for results queue to drain 41016 1727204206.83523: waiting for pending results... 
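The "Bring down test devices and profiles" task re-enters fedora.linux_system_roles.network through include_role, which is why the role's defaults/main.yml, meta/main.yml and tasks/main.yml are loaded again and the task lists are extended with the included blocks. The host-state dumps (run_state=3 with an always child state) suggest this cleanup sits in the always section of the test's block, so it runs whether or not the earlier assertions passed. A minimal sketch of that layout, assuming only what the state dump implies; the placeholder task and the omitted network_connections payload are not taken from the log.

    - block:
        - name: Placeholder for the test tasks traced earlier (illustration only)
          ansible.builtin.debug:
            msg: test body runs here
      always:
        - name: Bring down test devices and profiles
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.network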
41016 1727204206.83723: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 41016 1727204206.83880: in run() - task 028d2410-947f-12d5-0ec4-000000000641 41016 1727204206.83951: variable 'ansible_search_path' from source: unknown 41016 1727204206.83955: variable 'ansible_search_path' from source: unknown 41016 1727204206.83960: calling self._execute() 41016 1727204206.84061: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204206.84079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204206.84094: variable 'omit' from source: magic vars 41016 1727204206.84529: variable 'ansible_distribution_major_version' from source: facts 41016 1727204206.84581: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204206.84585: _execute() done 41016 1727204206.84588: dumping result to json 41016 1727204206.84590: done dumping result, returning 41016 1727204206.84599: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-12d5-0ec4-000000000641] 41016 1727204206.84602: sending task result for task 028d2410-947f-12d5-0ec4-000000000641 41016 1727204206.84861: no more pending results, returning what we have 41016 1727204206.84868: in VariableManager get_vars() 41016 1727204206.84979: Calling all_inventory to load vars for managed-node1 41016 1727204206.84984: Calling groups_inventory to load vars for managed-node1 41016 1727204206.84987: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204206.84994: done sending task result for task 028d2410-947f-12d5-0ec4-000000000641 41016 1727204206.84997: WORKER PROCESS EXITING 41016 1727204206.85014: Calling all_plugins_play to load vars for managed-node1 41016 1727204206.85018: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204206.85021: Calling groups_plugins_play to load vars for managed-node1 41016 1727204206.86066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204206.86943: done with get_vars() 41016 1727204206.86966: variable 'ansible_search_path' from source: unknown 41016 1727204206.86967: variable 'ansible_search_path' from source: unknown 41016 1727204206.87001: we have included files to process 41016 1727204206.87002: generating all_blocks data 41016 1727204206.87004: done generating all_blocks data 41016 1727204206.87005: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41016 1727204206.87006: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41016 1727204206.87007: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 41016 1727204206.87596: done processing included file 41016 1727204206.87598: iterating over new_blocks loaded from include file 41016 1727204206.87599: in VariableManager get_vars() 41016 1727204206.87627: done with get_vars() 41016 1727204206.87629: filtering new block on tags 41016 1727204206.87656: done filtering new block on tags 41016 1727204206.87659: in VariableManager get_vars() 41016 1727204206.87684: done with get_vars() 41016 1727204206.87686: filtering new block on tags 41016 1727204206.87726: done filtering new block on tags 41016 1727204206.87729: in 
VariableManager get_vars() 41016 1727204206.87752: done with get_vars() 41016 1727204206.87753: filtering new block on tags 41016 1727204206.87790: done filtering new block on tags 41016 1727204206.87792: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 41016 1727204206.87797: extending task lists for all hosts with included blocks 41016 1727204206.88549: done extending task lists 41016 1727204206.88550: done processing included files 41016 1727204206.88551: results queue empty 41016 1727204206.88551: checking for any_errors_fatal 41016 1727204206.88553: done checking for any_errors_fatal 41016 1727204206.88554: checking for max_fail_percentage 41016 1727204206.88554: done checking for max_fail_percentage 41016 1727204206.88555: checking to see if all hosts have failed and the running result is not ok 41016 1727204206.88556: done checking to see if all hosts have failed 41016 1727204206.88556: getting the remaining hosts for this loop 41016 1727204206.88557: done getting the remaining hosts for this loop 41016 1727204206.88558: getting the next task for host managed-node1 41016 1727204206.88561: done getting next task for host managed-node1 41016 1727204206.88564: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41016 1727204206.88567: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204206.88574: getting variables 41016 1727204206.88577: in VariableManager get_vars() 41016 1727204206.88588: Calling all_inventory to load vars for managed-node1 41016 1727204206.88590: Calling groups_inventory to load vars for managed-node1 41016 1727204206.88591: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204206.88595: Calling all_plugins_play to load vars for managed-node1 41016 1727204206.88597: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204206.88598: Calling groups_plugins_play to load vars for managed-node1 41016 1727204206.89281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204206.90197: done with get_vars() 41016 1727204206.90225: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:56:46 -0400 (0:00:00.073) 0:00:30.579 ***** 41016 1727204206.90313: entering _queue_task() for managed-node1/setup 41016 1727204206.90691: worker is 1 (out of 1 available) 41016 1727204206.90707: exiting _queue_task() for managed-node1/setup 41016 1727204206.90722: done queuing things up, now waiting for results queue to drain 41016 1727204206.90724: waiting for pending results... 41016 1727204206.91040: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 41016 1727204206.91186: in run() - task 028d2410-947f-12d5-0ec4-0000000006a7 41016 1727204206.91202: variable 'ansible_search_path' from source: unknown 41016 1727204206.91207: variable 'ansible_search_path' from source: unknown 41016 1727204206.91241: calling self._execute() 41016 1727204206.91343: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204206.91347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204206.91356: variable 'omit' from source: magic vars 41016 1727204206.91802: variable 'ansible_distribution_major_version' from source: facts 41016 1727204206.91811: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204206.91966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204206.93782: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204206.93787: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204206.93789: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204206.93792: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204206.93795: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204206.93797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204206.93800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 41016 1727204206.93802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204206.93848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204206.93860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204206.93918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204206.93940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204206.93961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204206.94007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204206.94010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204206.94167: variable '__network_required_facts' from source: role '' defaults 41016 1727204206.94173: variable 'ansible_facts' from source: unknown 41016 1727204206.94780: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 41016 1727204206.94786: when evaluation is False, skipping this task 41016 1727204206.94789: _execute() done 41016 1727204206.94791: dumping result to json 41016 1727204206.94793: done dumping result, returning 41016 1727204206.94803: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-12d5-0ec4-0000000006a7] 41016 1727204206.94806: sending task result for task 028d2410-947f-12d5-0ec4-0000000006a7 41016 1727204206.94893: done sending task result for task 028d2410-947f-12d5-0ec4-0000000006a7 41016 1727204206.94896: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41016 1727204206.94946: no more pending results, returning what we have 41016 1727204206.94951: results queue empty 41016 1727204206.94951: checking for any_errors_fatal 41016 1727204206.94953: done checking for any_errors_fatal 41016 1727204206.94954: checking for max_fail_percentage 41016 1727204206.94955: done checking for max_fail_percentage 41016 1727204206.94956: checking to see if all hosts have failed and the running result is not ok 41016 1727204206.94957: done checking to see if all hosts have failed 41016 1727204206.94957: getting the remaining hosts for this loop 41016 1727204206.94958: done getting the remaining hosts for 
this loop 41016 1727204206.94962: getting the next task for host managed-node1 41016 1727204206.94971: done getting next task for host managed-node1 41016 1727204206.94975: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 41016 1727204206.95030: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204206.95048: getting variables 41016 1727204206.95050: in VariableManager get_vars() 41016 1727204206.95089: Calling all_inventory to load vars for managed-node1 41016 1727204206.95092: Calling groups_inventory to load vars for managed-node1 41016 1727204206.95094: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204206.95103: Calling all_plugins_play to load vars for managed-node1 41016 1727204206.95106: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204206.95108: Calling groups_plugins_play to load vars for managed-node1 41016 1727204206.96295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204206.97176: done with get_vars() 41016 1727204206.97193: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:56:46 -0400 (0:00:00.069) 0:00:30.648 ***** 41016 1727204206.97269: entering _queue_task() for managed-node1/stat 41016 1727204206.97500: worker is 1 (out of 1 available) 41016 1727204206.97516: exiting _queue_task() for managed-node1/stat 41016 1727204206.97527: done queuing things up, now waiting for results queue to drain 41016 1727204206.97529: waiting for pending results... 
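The setup task skipped above is the role's guard for gathering only the facts it still needs: the conditional printed in the log subtracts the keys already present in ansible_facts from __network_required_facts (which comes from the role defaults) and runs setup only if something is missing, which is not the case here. A sketch reconstructed from that conditional; the gather_subset value is a placeholder, since the module arguments are not shown in this part of the log.

    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min   # placeholder; the real subset comes from the role, not from this log
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0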
41016 1727204206.97898: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 41016 1727204206.97903: in run() - task 028d2410-947f-12d5-0ec4-0000000006a9 41016 1727204206.97907: variable 'ansible_search_path' from source: unknown 41016 1727204206.97910: variable 'ansible_search_path' from source: unknown 41016 1727204206.98082: calling self._execute() 41016 1727204206.98086: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204206.98090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204206.98092: variable 'omit' from source: magic vars 41016 1727204206.98433: variable 'ansible_distribution_major_version' from source: facts 41016 1727204206.98444: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204206.98630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204206.98890: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204206.98926: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204206.98956: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204206.98983: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204206.99050: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204206.99065: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204206.99092: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204206.99113: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204206.99173: variable '__network_is_ostree' from source: set_fact 41016 1727204206.99180: Evaluated conditional (not __network_is_ostree is defined): False 41016 1727204206.99183: when evaluation is False, skipping this task 41016 1727204206.99186: _execute() done 41016 1727204206.99189: dumping result to json 41016 1727204206.99193: done dumping result, returning 41016 1727204206.99200: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-12d5-0ec4-0000000006a9] 41016 1727204206.99205: sending task result for task 028d2410-947f-12d5-0ec4-0000000006a9 41016 1727204206.99383: done sending task result for task 028d2410-947f-12d5-0ec4-0000000006a9 41016 1727204206.99386: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41016 1727204206.99598: no more pending results, returning what we have 41016 1727204206.99602: results queue empty 41016 1727204206.99604: checking for any_errors_fatal 41016 1727204206.99609: done checking for any_errors_fatal 41016 1727204206.99612: checking for 
max_fail_percentage 41016 1727204206.99613: done checking for max_fail_percentage 41016 1727204206.99614: checking to see if all hosts have failed and the running result is not ok 41016 1727204206.99615: done checking to see if all hosts have failed 41016 1727204206.99616: getting the remaining hosts for this loop 41016 1727204206.99617: done getting the remaining hosts for this loop 41016 1727204206.99620: getting the next task for host managed-node1 41016 1727204206.99627: done getting next task for host managed-node1 41016 1727204206.99630: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41016 1727204206.99635: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204206.99651: getting variables 41016 1727204206.99652: in VariableManager get_vars() 41016 1727204206.99692: Calling all_inventory to load vars for managed-node1 41016 1727204206.99695: Calling groups_inventory to load vars for managed-node1 41016 1727204206.99697: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204206.99706: Calling all_plugins_play to load vars for managed-node1 41016 1727204206.99709: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204206.99714: Calling groups_plugins_play to load vars for managed-node1 41016 1727204207.01298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204207.03006: done with get_vars() 41016 1727204207.03036: done getting variables 41016 1727204207.03115: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:56:47 -0400 (0:00:00.058) 0:00:30.707 ***** 41016 1727204207.03153: entering _queue_task() for managed-node1/set_fact 41016 1727204207.03540: worker is 1 (out of 1 available) 41016 1727204207.03553: exiting _queue_task() for managed-node1/set_fact 41016 1727204207.03565: done queuing things up, now waiting for results queue to drain 41016 1727204207.03566: waiting for pending results... 41016 1727204207.03743: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 41016 1727204207.03855: in run() - task 028d2410-947f-12d5-0ec4-0000000006aa 41016 1727204207.03867: variable 'ansible_search_path' from source: unknown 41016 1727204207.03871: variable 'ansible_search_path' from source: unknown 41016 1727204207.03901: calling self._execute() 41016 1727204207.03973: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204207.03979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204207.03988: variable 'omit' from source: magic vars 41016 1727204207.04274: variable 'ansible_distribution_major_version' from source: facts 41016 1727204207.04284: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204207.04404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204207.04597: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204207.04632: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204207.04656: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204207.04684: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204207.04747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204207.04766: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204207.04787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204207.04807: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204207.04871: variable '__network_is_ostree' from source: set_fact 41016 1727204207.04881: Evaluated conditional (not __network_is_ostree is defined): False 41016 1727204207.04885: when evaluation is False, skipping this task 41016 1727204207.04887: _execute() done 41016 1727204207.04890: dumping result to json 41016 1727204207.04892: done dumping result, returning 41016 1727204207.04898: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-12d5-0ec4-0000000006aa] 41016 1727204207.04904: sending task result for task 028d2410-947f-12d5-0ec4-0000000006aa 41016 1727204207.04990: done sending task result for task 028d2410-947f-12d5-0ec4-0000000006aa 41016 1727204207.04993: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 41016 1727204207.05038: no more pending results, returning what we have 41016 1727204207.05042: results queue empty 41016 1727204207.05043: checking for any_errors_fatal 41016 1727204207.05048: done checking for any_errors_fatal 41016 1727204207.05049: checking for max_fail_percentage 41016 1727204207.05051: done checking for max_fail_percentage 41016 1727204207.05051: checking to see if all hosts have failed and the running result is not ok 41016 1727204207.05052: done checking to see if all hosts have failed 41016 1727204207.05053: getting the remaining hosts for this loop 41016 1727204207.05054: done getting the remaining hosts for this loop 41016 1727204207.05058: getting the next task for host managed-node1 41016 1727204207.05067: done getting next task for host managed-node1 41016 1727204207.05071: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 41016 1727204207.05077: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 41016 1727204207.05098: getting variables 41016 1727204207.05099: in VariableManager get_vars() 41016 1727204207.05140: Calling all_inventory to load vars for managed-node1 41016 1727204207.05143: Calling groups_inventory to load vars for managed-node1 41016 1727204207.05145: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204207.05154: Calling all_plugins_play to load vars for managed-node1 41016 1727204207.05156: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204207.05159: Calling groups_plugins_play to load vars for managed-node1 41016 1727204207.05935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204207.06799: done with get_vars() 41016 1727204207.06817: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:56:47 -0400 (0:00:00.037) 0:00:30.744 ***** 41016 1727204207.06882: entering _queue_task() for managed-node1/service_facts 41016 1727204207.07113: worker is 1 (out of 1 available) 41016 1727204207.07125: exiting _queue_task() for managed-node1/service_facts 41016 1727204207.07137: done queuing things up, now waiting for results queue to drain 41016 1727204207.07139: waiting for pending results... 41016 1727204207.07318: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 41016 1727204207.07418: in run() - task 028d2410-947f-12d5-0ec4-0000000006ac 41016 1727204207.07430: variable 'ansible_search_path' from source: unknown 41016 1727204207.07434: variable 'ansible_search_path' from source: unknown 41016 1727204207.07460: calling self._execute() 41016 1727204207.07534: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204207.07539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204207.07547: variable 'omit' from source: magic vars 41016 1727204207.07831: variable 'ansible_distribution_major_version' from source: facts 41016 1727204207.07840: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204207.07846: variable 'omit' from source: magic vars 41016 1727204207.07894: variable 'omit' from source: magic vars 41016 1727204207.07923: variable 'omit' from source: magic vars 41016 1727204207.07953: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204207.07980: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204207.07995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204207.08007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204207.08021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204207.08044: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204207.08047: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204207.08050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 
1727204207.08121: Set connection var ansible_shell_executable to /bin/sh 41016 1727204207.08125: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204207.08134: Set connection var ansible_shell_type to sh 41016 1727204207.08137: Set connection var ansible_timeout to 10 41016 1727204207.08139: Set connection var ansible_pipelining to False 41016 1727204207.08146: Set connection var ansible_connection to ssh 41016 1727204207.08162: variable 'ansible_shell_executable' from source: unknown 41016 1727204207.08165: variable 'ansible_connection' from source: unknown 41016 1727204207.08168: variable 'ansible_module_compression' from source: unknown 41016 1727204207.08170: variable 'ansible_shell_type' from source: unknown 41016 1727204207.08172: variable 'ansible_shell_executable' from source: unknown 41016 1727204207.08174: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204207.08178: variable 'ansible_pipelining' from source: unknown 41016 1727204207.08180: variable 'ansible_timeout' from source: unknown 41016 1727204207.08185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204207.08326: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204207.08334: variable 'omit' from source: magic vars 41016 1727204207.08340: starting attempt loop 41016 1727204207.08345: running the handler 41016 1727204207.08357: _low_level_execute_command(): starting 41016 1727204207.08364: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204207.08848: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204207.08879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204207.08883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204207.08885: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204207.08888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204207.08928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204207.08941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204207.09039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204207.10829: stdout chunk (state=3): >>>/root <<< 41016 1727204207.10925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204207.10956: stderr chunk (state=3): >>><<< 41016 
1727204207.10959: stdout chunk (state=3): >>><<< 41016 1727204207.10978: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204207.10989: _low_level_execute_command(): starting 41016 1727204207.10995: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204207.1097727-43111-32981602644056 `" && echo ansible-tmp-1727204207.1097727-43111-32981602644056="` echo /root/.ansible/tmp/ansible-tmp-1727204207.1097727-43111-32981602644056 `" ) && sleep 0' 41016 1727204207.11428: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204207.11431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204207.11433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204207.11442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204207.11444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204207.11446: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204207.11481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204207.11506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204207.11508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204207.11579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204207.13686: stdout chunk (state=3): 
>>>ansible-tmp-1727204207.1097727-43111-32981602644056=/root/.ansible/tmp/ansible-tmp-1727204207.1097727-43111-32981602644056 <<< 41016 1727204207.13797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204207.13822: stderr chunk (state=3): >>><<< 41016 1727204207.13825: stdout chunk (state=3): >>><<< 41016 1727204207.13839: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204207.1097727-43111-32981602644056=/root/.ansible/tmp/ansible-tmp-1727204207.1097727-43111-32981602644056 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204207.13877: variable 'ansible_module_compression' from source: unknown 41016 1727204207.13915: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 41016 1727204207.13943: variable 'ansible_facts' from source: unknown 41016 1727204207.13997: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204207.1097727-43111-32981602644056/AnsiballZ_service_facts.py 41016 1727204207.14093: Sending initial data 41016 1727204207.14097: Sent initial data (161 bytes) 41016 1727204207.14532: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204207.14536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204207.14538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 41016 1727204207.14540: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204207.14543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204207.14587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' <<< 41016 1727204207.14590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204207.14592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204207.14674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204207.16384: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 41016 1727204207.16388: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204207.16457: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204207.16531: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpywslvulr /root/.ansible/tmp/ansible-tmp-1727204207.1097727-43111-32981602644056/AnsiballZ_service_facts.py <<< 41016 1727204207.16535: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204207.1097727-43111-32981602644056/AnsiballZ_service_facts.py" <<< 41016 1727204207.16603: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpywslvulr" to remote "/root/.ansible/tmp/ansible-tmp-1727204207.1097727-43111-32981602644056/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204207.1097727-43111-32981602644056/AnsiballZ_service_facts.py" <<< 41016 1727204207.17273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204207.17312: stderr chunk (state=3): >>><<< 41016 1727204207.17316: stdout chunk (state=3): >>><<< 41016 1727204207.17351: done transferring module to remote 41016 1727204207.17360: _low_level_execute_command(): starting 41016 1727204207.17363: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204207.1097727-43111-32981602644056/ /root/.ansible/tmp/ansible-tmp-1727204207.1097727-43111-32981602644056/AnsiballZ_service_facts.py && sleep 0' 41016 1727204207.17765: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204207.17800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204207.17803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204207.17805: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204207.17811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204207.17855: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204207.17858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204207.17941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204207.19886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204207.19910: stderr chunk (state=3): >>><<< 41016 1727204207.19915: stdout chunk (state=3): >>><<< 41016 1727204207.19928: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204207.19933: _low_level_execute_command(): starting 41016 1727204207.19938: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204207.1097727-43111-32981602644056/AnsiballZ_service_facts.py && sleep 0' 41016 1727204207.20337: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204207.20365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204207.20370: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204207.20373: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204207.20376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204207.20428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204207.20432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204207.20519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204208.96744: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": 
{"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 41016 1727204208.98406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204208.98452: stderr chunk (state=3): >>><<< 41016 1727204208.98782: stdout chunk (state=3): >>><<< 41016 1727204208.98788: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": 
"systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": 
{"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": 
"man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
41016 1727204209.00580: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204207.1097727-43111-32981602644056/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204209.00623: _low_level_execute_command(): starting 41016 1727204209.00689: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204207.1097727-43111-32981602644056/ > /dev/null 2>&1 && sleep 0' 41016 1727204209.01712: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 41016 1727204209.01789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204209.02020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204209.02094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204209.04071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204209.04331: stderr chunk (state=3): >>><<< 41016 1727204209.04335: stdout chunk (state=3): >>><<< 41016 1727204209.04419: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204209.04423: handler run complete 41016 1727204209.04665: variable 'ansible_facts' from source: unknown 41016 1727204209.05021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204209.05900: variable 'ansible_facts' from source: unknown 41016 1727204209.06328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204209.06633: attempt loop complete, returning result 41016 1727204209.06636: _execute() done 41016 1727204209.06641: dumping result to json 41016 1727204209.06705: done dumping result, returning 41016 1727204209.06714: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-12d5-0ec4-0000000006ac] 41016 1727204209.06717: sending task result for task 028d2410-947f-12d5-0ec4-0000000006ac 41016 1727204209.08584: done sending task result for task 028d2410-947f-12d5-0ec4-0000000006ac 41016 1727204209.08587: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41016 1727204209.08700: no more pending results, returning what we have 41016 1727204209.08703: results queue empty 41016 1727204209.08704: checking for any_errors_fatal 41016 1727204209.08707: done checking for any_errors_fatal 41016 1727204209.08708: checking for max_fail_percentage 41016 1727204209.08711: done checking for max_fail_percentage 41016 1727204209.08712: checking to see if all hosts have failed and the running result is not ok 41016 1727204209.08713: done checking to see if all hosts have failed 41016 1727204209.08713: getting the remaining hosts for this loop 41016 1727204209.08715: done getting the remaining hosts for this loop 41016 1727204209.08718: getting the next task for host managed-node1 41016 1727204209.08723: done getting next task for host managed-node1 41016 1727204209.08727: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 41016 1727204209.08734: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204209.08743: getting variables 41016 1727204209.08745: in VariableManager get_vars() 41016 1727204209.08779: Calling all_inventory to load vars for managed-node1 41016 1727204209.08782: Calling groups_inventory to load vars for managed-node1 41016 1727204209.08784: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204209.08792: Calling all_plugins_play to load vars for managed-node1 41016 1727204209.08795: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204209.08797: Calling groups_plugins_play to load vars for managed-node1 41016 1727204209.11860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204209.16690: done with get_vars() 41016 1727204209.16719: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:56:49 -0400 (0:00:02.099) 0:00:32.844 ***** 41016 1727204209.16826: entering _queue_task() for managed-node1/package_facts 41016 1727204209.17590: worker is 1 (out of 1 available) 41016 1727204209.17601: exiting _queue_task() for managed-node1/package_facts 41016 1727204209.17617: done queuing things up, now waiting for results queue to drain 41016 1727204209.17619: waiting for pending results... 41016 1727204209.18160: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 41016 1727204209.18486: in run() - task 028d2410-947f-12d5-0ec4-0000000006ad 41016 1727204209.18501: variable 'ansible_search_path' from source: unknown 41016 1727204209.18505: variable 'ansible_search_path' from source: unknown 41016 1727204209.18582: calling self._execute() 41016 1727204209.18623: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204209.18627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204209.18639: variable 'omit' from source: magic vars 41016 1727204209.19411: variable 'ansible_distribution_major_version' from source: facts 41016 1727204209.19420: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204209.19427: variable 'omit' from source: magic vars 41016 1727204209.19779: variable 'omit' from source: magic vars 41016 1727204209.19782: variable 'omit' from source: magic vars 41016 1727204209.19785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204209.19804: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204209.19823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204209.19840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204209.19851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204209.19882: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204209.20088: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204209.20092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204209.20190: Set connection var ansible_shell_executable to /bin/sh 41016 
1727204209.20196: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204209.20202: Set connection var ansible_shell_type to sh 41016 1727204209.20212: Set connection var ansible_timeout to 10 41016 1727204209.20214: Set connection var ansible_pipelining to False 41016 1727204209.20219: Set connection var ansible_connection to ssh 41016 1727204209.20319: variable 'ansible_shell_executable' from source: unknown 41016 1727204209.20323: variable 'ansible_connection' from source: unknown 41016 1727204209.20325: variable 'ansible_module_compression' from source: unknown 41016 1727204209.20327: variable 'ansible_shell_type' from source: unknown 41016 1727204209.20329: variable 'ansible_shell_executable' from source: unknown 41016 1727204209.20331: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204209.20332: variable 'ansible_pipelining' from source: unknown 41016 1727204209.20334: variable 'ansible_timeout' from source: unknown 41016 1727204209.20336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204209.20648: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204209.20657: variable 'omit' from source: magic vars 41016 1727204209.21037: starting attempt loop 41016 1727204209.21040: running the handler 41016 1727204209.21043: _low_level_execute_command(): starting 41016 1727204209.21045: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204209.22222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204209.22365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204209.22391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204209.22404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204209.22534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204209.24280: stdout chunk (state=3): >>>/root <<< 41016 1727204209.24374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204209.24408: stderr chunk (state=3): >>><<< 41016 1727204209.24417: stdout chunk (state=3): >>><<< 41016 1727204209.24584: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41016 1727204209.24588: _low_level_execute_command(): starting
41016 1727204209.24591: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204209.2449057-43175-198984598242305 `" && echo ansible-tmp-1727204209.2449057-43175-198984598242305="` echo /root/.ansible/tmp/ansible-tmp-1727204209.2449057-43175-198984598242305 `" ) && sleep 0'
41016 1727204209.25883: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41016 1727204209.25967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
41016 1727204209.25982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41016 1727204209.25992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41016 1727204209.26115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41016 1727204209.28194: stdout chunk (state=3): >>>ansible-tmp-1727204209.2449057-43175-198984598242305=/root/.ansible/tmp/ansible-tmp-1727204209.2449057-43175-198984598242305 <<<
41016 1727204209.28406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41016 1727204209.28520: stderr chunk (state=3): >>><<<
41016 1727204209.28524: stdout chunk (state=3): >>><<<
41016 1727204209.28540: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204209.2449057-43175-198984598242305=/root/.ansible/tmp/ansible-tmp-1727204209.2449057-43175-198984598242305 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41016 1727204209.28697: variable 'ansible_module_compression' from source: unknown
41016 1727204209.28701: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED
41016 1727204209.28982: variable 'ansible_facts' from source: unknown
41016 1727204209.29641: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204209.2449057-43175-198984598242305/AnsiballZ_package_facts.py
41016 1727204209.30174: Sending initial data
41016 1727204209.30181: Sent initial data (162 bytes)
41016 1727204209.31395: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<<
41016 1727204209.31410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41016 1727204209.31597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41016 1727204209.33405: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
41016 1727204209.33481: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
41016 1727204209.33558: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpoixt7y2l /root/.ansible/tmp/ansible-tmp-1727204209.2449057-43175-198984598242305/AnsiballZ_package_facts.py <<<
41016 1727204209.33571: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204209.2449057-43175-198984598242305/AnsiballZ_package_facts.py" <<<
41016 1727204209.33696: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpoixt7y2l" to remote "/root/.ansible/tmp/ansible-tmp-1727204209.2449057-43175-198984598242305/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204209.2449057-43175-198984598242305/AnsiballZ_package_facts.py" <<<
41016 1727204209.36587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41016 1727204209.36644: stderr chunk (state=3): >>><<<
41016 1727204209.36652: stdout chunk (state=3): >>><<<
41016 1727204209.36707: done transferring module to remote
41016 1727204209.36792: _low_level_execute_command(): starting
41016 1727204209.36800: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204209.2449057-43175-198984598242305/ /root/.ansible/tmp/ansible-tmp-1727204209.2449057-43175-198984598242305/AnsiballZ_package_facts.py && sleep 0'
41016 1727204209.37919: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
41016 1727204209.37936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<<
41016 1727204209.37940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
41016 1727204209.38042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
41016 1727204209.38147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41016 1727204209.38172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
41016 1727204209.38178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
41016 1727204209.38366: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41016 1727204209.38473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41016 1727204209.40392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
41016 1727204209.40429: stderr chunk (state=3): >>><<<
41016 1727204209.40684: stdout chunk (state=3): >>><<<
41016 1727204209.40688: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
41016 1727204209.40690: _low_level_execute_command(): starting
41016 1727204209.40693: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204209.2449057-43175-198984598242305/AnsiballZ_package_facts.py && sleep 0'
41016 1727204209.41892: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
41016 1727204209.42039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<<
41016 1727204209.42061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
41016 1727204209.42392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
41016 1727204209.89117: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}],
"hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", 
"version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": 
"5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 41016 1727204209.89168: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", 
"version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 41016 1727204209.89289: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": 
"2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": 
"2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": 
"rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name":<<< 41016 1727204209.89380: stdout chunk (state=3): >>> "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": 
"perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": 
"rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": 
"12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 41016 1727204209.91402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204209.91485: stderr chunk (state=3): >>><<< 41016 1727204209.91488: stdout chunk (state=3): >>><<< 41016 1727204209.91764: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": 
[{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": 
"3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", 
"release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", 
"version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 41016 1727204209.96165: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204209.2449057-43175-198984598242305/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204209.96308: _low_level_execute_command(): starting 41016 1727204209.96323: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204209.2449057-43175-198984598242305/ > /dev/null 2>&1 && sleep 0' 41016 1727204209.97718: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204209.97723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204209.97792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204209.97813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204209.97872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204209.97972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204209.99966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204209.99981: stdout chunk (state=3): >>><<< 41016 1727204209.99994: stderr chunk (state=3): >>><<< 41016 1727204210.00015: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204210.00027: handler run complete 41016 1727204210.00916: variable 'ansible_facts' from source: unknown 41016 1727204210.01359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204210.03772: variable 'ansible_facts' from source: unknown 41016 1727204210.04381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204210.04956: attempt loop complete, returning result 41016 1727204210.04973: _execute() done 41016 1727204210.04984: dumping result to json 41016 1727204210.05190: done dumping result, returning 41016 1727204210.05204: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-12d5-0ec4-0000000006ad] 41016 1727204210.05213: sending task result for task 028d2410-947f-12d5-0ec4-0000000006ad 41016 1727204210.07606: done sending task result for task 028d2410-947f-12d5-0ec4-0000000006ad 41016 1727204210.07609: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41016 1727204210.07771: no more pending results, returning what we have 41016 1727204210.07774: results queue empty 41016 1727204210.07777: checking for any_errors_fatal 41016 1727204210.07782: done checking for any_errors_fatal 41016 1727204210.07783: checking for max_fail_percentage 41016 1727204210.07784: done checking for max_fail_percentage 41016 1727204210.07785: checking to see if all hosts have failed and the running result is not ok 41016 1727204210.07786: done checking to see if all hosts have failed 41016 1727204210.07786: getting the remaining hosts for this loop 41016 1727204210.07788: done getting the remaining hosts for this loop 41016 1727204210.07791: getting the next task for host managed-node1 41016 1727204210.07797: done getting next task for host managed-node1 41016 1727204210.07801: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 41016 1727204210.07806: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204210.07822: getting variables 41016 1727204210.07824: in VariableManager get_vars() 41016 1727204210.07894: Calling all_inventory to load vars for managed-node1 41016 1727204210.07898: Calling groups_inventory to load vars for managed-node1 41016 1727204210.07900: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204210.07913: Calling all_plugins_play to load vars for managed-node1 41016 1727204210.07916: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204210.07920: Calling groups_plugins_play to load vars for managed-node1 41016 1727204210.09327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204210.12947: done with get_vars() 41016 1727204210.12983: done getting variables 41016 1727204210.13068: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:56:50 -0400 (0:00:00.962) 0:00:33.807 ***** 41016 1727204210.13114: entering _queue_task() for managed-node1/debug 41016 1727204210.13472: worker is 1 (out of 1 available) 41016 1727204210.13598: exiting _queue_task() for managed-node1/debug 41016 1727204210.13609: done queuing things up, now waiting for results queue to drain 41016 1727204210.13610: waiting for pending results... 
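(An aside for readers of this trace: the "ok" result above is censored because the package check ran with 'no_log: true', as the message itself notes; the uncensored payload is the long JSON dump earlier in this section, a mapping from package name to a list of {name, version, release, epoch, arch, source} entries returned by package_facts. Below is a minimal Python sketch of that shape and how to query it; the two sample entries are trimmed copies from the dump above, and pkg_version() is an illustrative helper, not part of Ansible.)

# Shape of the package_facts payload shown above, with a small query helper.
packages = {
    "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10",
             "epoch": None, "arch": "noarch", "source": "rpm"}],
    "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10",
                   "epoch": None, "arch": "noarch", "source": "rpm"}],
}

def pkg_version(packages, name):
    """Return the version of the first installed instance of name, or None."""
    instances = packages.get(name, [])
    return instances[0]["version"] if instances else None

print(pkg_version(packages, "firewalld"))           # 2.2.1
print(pkg_version(packages, "NetworkManager-tui"))  # None (not in this trimmed sample)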
41016 1727204210.13993: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 41016 1727204210.13998: in run() - task 028d2410-947f-12d5-0ec4-000000000642 41016 1727204210.14003: variable 'ansible_search_path' from source: unknown 41016 1727204210.14006: variable 'ansible_search_path' from source: unknown 41016 1727204210.14043: calling self._execute() 41016 1727204210.14172: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204210.14178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204210.14250: variable 'omit' from source: magic vars 41016 1727204210.14671: variable 'ansible_distribution_major_version' from source: facts 41016 1727204210.14688: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204210.14693: variable 'omit' from source: magic vars 41016 1727204210.14753: variable 'omit' from source: magic vars 41016 1727204210.14854: variable 'network_provider' from source: set_fact 41016 1727204210.14870: variable 'omit' from source: magic vars 41016 1727204210.14916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204210.14948: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204210.14974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204210.15080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204210.15083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204210.15085: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204210.15088: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204210.15090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204210.15143: Set connection var ansible_shell_executable to /bin/sh 41016 1727204210.15148: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204210.15154: Set connection var ansible_shell_type to sh 41016 1727204210.15160: Set connection var ansible_timeout to 10 41016 1727204210.15165: Set connection var ansible_pipelining to False 41016 1727204210.15178: Set connection var ansible_connection to ssh 41016 1727204210.15200: variable 'ansible_shell_executable' from source: unknown 41016 1727204210.15203: variable 'ansible_connection' from source: unknown 41016 1727204210.15206: variable 'ansible_module_compression' from source: unknown 41016 1727204210.15209: variable 'ansible_shell_type' from source: unknown 41016 1727204210.15213: variable 'ansible_shell_executable' from source: unknown 41016 1727204210.15216: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204210.15218: variable 'ansible_pipelining' from source: unknown 41016 1727204210.15220: variable 'ansible_timeout' from source: unknown 41016 1727204210.15228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204210.15363: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 41016 1727204210.15373: variable 'omit' from source: magic vars 41016 1727204210.15580: starting attempt loop 41016 1727204210.15583: running the handler 41016 1727204210.15585: handler run complete 41016 1727204210.15587: attempt loop complete, returning result 41016 1727204210.15588: _execute() done 41016 1727204210.15590: dumping result to json 41016 1727204210.15592: done dumping result, returning 41016 1727204210.15593: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-12d5-0ec4-000000000642] 41016 1727204210.15595: sending task result for task 028d2410-947f-12d5-0ec4-000000000642 41016 1727204210.15656: done sending task result for task 028d2410-947f-12d5-0ec4-000000000642 41016 1727204210.15661: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 41016 1727204210.15729: no more pending results, returning what we have 41016 1727204210.15733: results queue empty 41016 1727204210.15734: checking for any_errors_fatal 41016 1727204210.15745: done checking for any_errors_fatal 41016 1727204210.15745: checking for max_fail_percentage 41016 1727204210.15747: done checking for max_fail_percentage 41016 1727204210.15749: checking to see if all hosts have failed and the running result is not ok 41016 1727204210.15749: done checking to see if all hosts have failed 41016 1727204210.15750: getting the remaining hosts for this loop 41016 1727204210.15752: done getting the remaining hosts for this loop 41016 1727204210.15755: getting the next task for host managed-node1 41016 1727204210.15768: done getting next task for host managed-node1 41016 1727204210.15772: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41016 1727204210.15778: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204210.15791: getting variables 41016 1727204210.15793: in VariableManager get_vars() 41016 1727204210.15837: Calling all_inventory to load vars for managed-node1 41016 1727204210.15840: Calling groups_inventory to load vars for managed-node1 41016 1727204210.15842: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204210.15852: Calling all_plugins_play to load vars for managed-node1 41016 1727204210.15856: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204210.15859: Calling groups_plugins_play to load vars for managed-node1 41016 1727204210.17897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204210.20878: done with get_vars() 41016 1727204210.20907: done getting variables 41016 1727204210.20966: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:56:50 -0400 (0:00:00.078) 0:00:33.886 ***** 41016 1727204210.21010: entering _queue_task() for managed-node1/fail 41016 1727204210.21367: worker is 1 (out of 1 available) 41016 1727204210.21380: exiting _queue_task() for managed-node1/fail 41016 1727204210.21393: done queuing things up, now waiting for results queue to drain 41016 1727204210.21395: waiting for pending results... 
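(An aside: the "Print network provider" debug result above rendered to "Using network provider: nm". Ansible templates a debug msg with Jinja2 before printing it; the sketch below re-renders a message of that form with plain Jinja2. The exact template string used by the role is an assumption inferred from the output, and the snippet needs the jinja2 package installed.)

# Illustrative only: re-render the debug message seen above with plain Jinja2.
from jinja2 import Environment

env = Environment()
template = env.from_string("Using network provider: {{ network_provider }}")  # assumed template
print(template.render(network_provider="nm"))  # -> Using network provider: nm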
41016 1727204210.21713: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 41016 1727204210.21982: in run() - task 028d2410-947f-12d5-0ec4-000000000643 41016 1727204210.21987: variable 'ansible_search_path' from source: unknown 41016 1727204210.21991: variable 'ansible_search_path' from source: unknown 41016 1727204210.21993: calling self._execute() 41016 1727204210.22002: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204210.22009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204210.22021: variable 'omit' from source: magic vars 41016 1727204210.22420: variable 'ansible_distribution_major_version' from source: facts 41016 1727204210.22431: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204210.22561: variable 'network_state' from source: role '' defaults 41016 1727204210.22781: Evaluated conditional (network_state != {}): False 41016 1727204210.22785: when evaluation is False, skipping this task 41016 1727204210.22787: _execute() done 41016 1727204210.22789: dumping result to json 41016 1727204210.22791: done dumping result, returning 41016 1727204210.22793: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-12d5-0ec4-000000000643] 41016 1727204210.22795: sending task result for task 028d2410-947f-12d5-0ec4-000000000643 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41016 1727204210.22901: no more pending results, returning what we have 41016 1727204210.22905: results queue empty 41016 1727204210.22906: checking for any_errors_fatal 41016 1727204210.22912: done checking for any_errors_fatal 41016 1727204210.22912: checking for max_fail_percentage 41016 1727204210.22914: done checking for max_fail_percentage 41016 1727204210.22915: checking to see if all hosts have failed and the running result is not ok 41016 1727204210.22916: done checking to see if all hosts have failed 41016 1727204210.22916: getting the remaining hosts for this loop 41016 1727204210.22918: done getting the remaining hosts for this loop 41016 1727204210.22921: getting the next task for host managed-node1 41016 1727204210.22960: done getting next task for host managed-node1 41016 1727204210.22965: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41016 1727204210.22969: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204210.22988: getting variables 41016 1727204210.22990: in VariableManager get_vars() 41016 1727204210.23030: Calling all_inventory to load vars for managed-node1 41016 1727204210.23033: Calling groups_inventory to load vars for managed-node1 41016 1727204210.23036: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204210.23047: Calling all_plugins_play to load vars for managed-node1 41016 1727204210.23051: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204210.23054: Calling groups_plugins_play to load vars for managed-node1 41016 1727204210.23573: done sending task result for task 028d2410-947f-12d5-0ec4-000000000643 41016 1727204210.23578: WORKER PROCESS EXITING 41016 1727204210.25543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204210.27446: done with get_vars() 41016 1727204210.27468: done getting variables 41016 1727204210.27531: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:56:50 -0400 (0:00:00.065) 0:00:33.951 ***** 41016 1727204210.27569: entering _queue_task() for managed-node1/fail 41016 1727204210.28013: worker is 1 (out of 1 available) 41016 1727204210.28024: exiting _queue_task() for managed-node1/fail 41016 1727204210.28036: done queuing things up, now waiting for results queue to drain 41016 1727204210.28038: waiting for pending results... 
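(An aside: the skip above reports false_condition "network_state != {}": the inherited "ansible_distribution_major_version != '6'" check passed, but the role's network_state default is an empty dict, so the guard does not fire. Ansible evaluates such bare 'when:' strings by templating them with Jinja2; the evaluate() helper below is a simplified sketch of that idea, not Ansible's real implementation, which adds its own safety checks, filters, and tests.)

from jinja2 import Environment

env = Environment()

def evaluate(conditional, variables):
    # Simplified: wrap the bare expression in an {% if %} block and render it.
    template = "{%% if %s %%}True{%% else %%}False{%% endif %%}" % conditional
    return env.from_string(template).render(**variables) == "True"

# Major version inferred from the el10 package set above; network_state is the
# empty role default reported in the trace.
facts = {"ansible_distribution_major_version": "10", "network_state": {}}
print(evaluate("ansible_distribution_major_version != '6'", facts))  # True
print(evaluate("network_state != {}", facts))                        # False -> task skipped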
41016 1727204210.28270: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 41016 1727204210.28446: in run() - task 028d2410-947f-12d5-0ec4-000000000644 41016 1727204210.28460: variable 'ansible_search_path' from source: unknown 41016 1727204210.28464: variable 'ansible_search_path' from source: unknown 41016 1727204210.28516: calling self._execute() 41016 1727204210.28645: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204210.28648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204210.28657: variable 'omit' from source: magic vars 41016 1727204210.29083: variable 'ansible_distribution_major_version' from source: facts 41016 1727204210.29094: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204210.29219: variable 'network_state' from source: role '' defaults 41016 1727204210.29230: Evaluated conditional (network_state != {}): False 41016 1727204210.29234: when evaluation is False, skipping this task 41016 1727204210.29237: _execute() done 41016 1727204210.29240: dumping result to json 41016 1727204210.29248: done dumping result, returning 41016 1727204210.29255: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-12d5-0ec4-000000000644] 41016 1727204210.29261: sending task result for task 028d2410-947f-12d5-0ec4-000000000644 41016 1727204210.29364: done sending task result for task 028d2410-947f-12d5-0ec4-000000000644 41016 1727204210.29368: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41016 1727204210.29440: no more pending results, returning what we have 41016 1727204210.29445: results queue empty 41016 1727204210.29446: checking for any_errors_fatal 41016 1727204210.29461: done checking for any_errors_fatal 41016 1727204210.29462: checking for max_fail_percentage 41016 1727204210.29464: done checking for max_fail_percentage 41016 1727204210.29465: checking to see if all hosts have failed and the running result is not ok 41016 1727204210.29466: done checking to see if all hosts have failed 41016 1727204210.29467: getting the remaining hosts for this loop 41016 1727204210.29469: done getting the remaining hosts for this loop 41016 1727204210.29473: getting the next task for host managed-node1 41016 1727204210.29484: done getting next task for host managed-node1 41016 1727204210.29489: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41016 1727204210.29494: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204210.29520: getting variables 41016 1727204210.29522: in VariableManager get_vars() 41016 1727204210.29787: Calling all_inventory to load vars for managed-node1 41016 1727204210.29790: Calling groups_inventory to load vars for managed-node1 41016 1727204210.29792: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204210.29802: Calling all_plugins_play to load vars for managed-node1 41016 1727204210.29805: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204210.29808: Calling groups_plugins_play to load vars for managed-node1 41016 1727204210.31183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204210.32957: done with get_vars() 41016 1727204210.33007: done getting variables 41016 1727204210.33084: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:56:50 -0400 (0:00:00.055) 0:00:34.007 ***** 41016 1727204210.33122: entering _queue_task() for managed-node1/fail 41016 1727204210.33602: worker is 1 (out of 1 available) 41016 1727204210.33612: exiting _queue_task() for managed-node1/fail 41016 1727204210.33623: done queuing things up, now waiting for results queue to drain 41016 1727204210.33624: waiting for pending results... 
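(An aside on the timestamp lines that follow each TASK banner, e.g. "... (0:00:00.055) 0:00:34.007 *****": the value in parentheses is the time spent on the previous task and the second value is the cumulative playbook runtime, which is why 0:00:33.951 plus 0:00:00.055 lands on 0:00:34.007 above, give or take rounding. Below is a small sketch for pulling those durations out when skimming a log like this; the regex and helper are mine, not Ansible code.)

import re

TIMING = re.compile(r"\((?P<task>\d+:\d+:\d+\.\d+)\)\s+(?P<total>\d+:\d+:\d+\.\d+)")

def to_seconds(hms):
    hours, minutes, seconds = hms.split(":")
    return int(hours) * 3600 + int(minutes) * 60 + float(seconds)

line = "Tuesday 24 September 2024 14:56:50 -0400 (0:00:00.055) 0:00:34.007 *****"
m = TIMING.search(line)
print(to_seconds(m.group("task")), to_seconds(m.group("total")))  # 0.055 34.007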
41016 1727204210.33932: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 41016 1727204210.34016: in run() - task 028d2410-947f-12d5-0ec4-000000000645 41016 1727204210.34032: variable 'ansible_search_path' from source: unknown 41016 1727204210.34050: variable 'ansible_search_path' from source: unknown 41016 1727204210.34091: calling self._execute() 41016 1727204210.34171: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204210.34178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204210.34185: variable 'omit' from source: magic vars 41016 1727204210.34477: variable 'ansible_distribution_major_version' from source: facts 41016 1727204210.34486: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204210.34610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204210.36304: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204210.36367: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204210.36411: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204210.36439: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204210.36465: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204210.36543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204210.36572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204210.36597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.36638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204210.36652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204210.36762: variable 'ansible_distribution_major_version' from source: facts 41016 1727204210.36771: Evaluated conditional (ansible_distribution_major_version | int > 9): True 41016 1727204210.36888: variable 'ansible_distribution' from source: facts 41016 1727204210.36892: variable '__network_rh_distros' from source: role '' defaults 41016 1727204210.36906: Evaluated conditional (ansible_distribution in __network_rh_distros): True 41016 1727204210.37138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204210.37154: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204210.37174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.37204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204210.37215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204210.37247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204210.37270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204210.37305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.37331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204210.37342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204210.37370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204210.37389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204210.37411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.37438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204210.37448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204210.37644: variable 'network_connections' from source: include params 41016 1727204210.37653: variable 'interface0' from source: play vars 41016 1727204210.37705: variable 'interface0' from source: play vars 41016 1727204210.37712: variable 'interface1' from source: play vars 41016 1727204210.37757: variable 'interface1' from source: play vars 41016 1727204210.37764: variable 'network_state' from source: role '' defaults 41016 
1727204210.37820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204210.37939: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204210.37966: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204210.37990: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204210.38026: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204210.38060: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204210.38079: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204210.38097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.38117: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204210.38137: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 41016 1727204210.38143: when evaluation is False, skipping this task 41016 1727204210.38146: _execute() done 41016 1727204210.38150: dumping result to json 41016 1727204210.38152: done dumping result, returning 41016 1727204210.38162: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-12d5-0ec4-000000000645] 41016 1727204210.38165: sending task result for task 028d2410-947f-12d5-0ec4-000000000645 41016 1727204210.38249: done sending task result for task 028d2410-947f-12d5-0ec4-000000000645 41016 1727204210.38252: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 41016 1727204210.38312: no more pending results, returning what we have 41016 1727204210.38316: results queue empty 41016 1727204210.38317: checking for any_errors_fatal 41016 1727204210.38323: done checking for any_errors_fatal 41016 1727204210.38324: checking for max_fail_percentage 41016 1727204210.38325: done checking for max_fail_percentage 41016 1727204210.38326: checking to see if all hosts have failed and the running result is not ok 41016 1727204210.38327: done checking to see if all hosts have failed 41016 1727204210.38327: getting the remaining hosts for this loop 41016 1727204210.38329: done getting the remaining hosts for this loop 41016 1727204210.38333: getting the next 
task for host managed-node1 41016 1727204210.38341: done getting next task for host managed-node1 41016 1727204210.38344: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41016 1727204210.38348: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204210.38367: getting variables 41016 1727204210.38368: in VariableManager get_vars() 41016 1727204210.38419: Calling all_inventory to load vars for managed-node1 41016 1727204210.38422: Calling groups_inventory to load vars for managed-node1 41016 1727204210.38425: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204210.38434: Calling all_plugins_play to load vars for managed-node1 41016 1727204210.38436: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204210.38439: Calling groups_plugins_play to load vars for managed-node1 41016 1727204210.39550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204210.40846: done with get_vars() 41016 1727204210.40866: done getting variables 41016 1727204210.40911: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:56:50 -0400 (0:00:00.078) 0:00:34.085 ***** 41016 1727204210.40934: entering _queue_task() for managed-node1/dnf 41016 1727204210.41188: worker is 1 (out of 1 available) 41016 1727204210.41202: exiting _queue_task() for managed-node1/dnf 41016 1727204210.41214: done queuing things up, now waiting for results queue to drain 41016 1727204210.41216: waiting for pending results... 
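NOTE: the skip recorded above for "Abort applying teaming configuration if the system version of the managed host is EL10 or later" comes from the when: condition printed in the trace. A minimal sketch of such a guard task, assuming it uses ansible.builtin.fail (the module and message are not shown in this part of the trace) and taking the condition verbatim from the log:

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on this system version  # assumed wording; the real message is not shown here
  when: >-
    network_connections | selectattr("type", "defined")
    | selectattr("type", "match", "^team$") | list | length > 0
    or network_state.get("interfaces", []) | selectattr("type", "defined")
    | selectattr("type", "match", "^team$") | list | length > 0

Neither interface0 nor interface1 is a team connection and network_state is the empty role default, so both selectattr chains produce empty lists, the condition evaluates to False, and the task is skipped rather than failing the play.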
41016 1727204210.41403: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 41016 1727204210.41495: in run() - task 028d2410-947f-12d5-0ec4-000000000646 41016 1727204210.41507: variable 'ansible_search_path' from source: unknown 41016 1727204210.41511: variable 'ansible_search_path' from source: unknown 41016 1727204210.41542: calling self._execute() 41016 1727204210.41618: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204210.41623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204210.41631: variable 'omit' from source: magic vars 41016 1727204210.41913: variable 'ansible_distribution_major_version' from source: facts 41016 1727204210.41924: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204210.42059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204210.44198: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204210.44210: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204210.44250: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204210.44286: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204210.44317: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204210.44394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204210.44426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204210.44449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.44479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204210.44489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204210.44580: variable 'ansible_distribution' from source: facts 41016 1727204210.44584: variable 'ansible_distribution_major_version' from source: facts 41016 1727204210.44597: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 41016 1727204210.44679: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204210.44765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204210.44783: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204210.44800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.44827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204210.44838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204210.44867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204210.44886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204210.44903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.44929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204210.44939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204210.44966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204210.44985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204210.45001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.45027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204210.45037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204210.45143: variable 'network_connections' from source: include params 41016 1727204210.45153: variable 'interface0' from source: play vars 41016 1727204210.45206: variable 'interface0' from source: play vars 41016 1727204210.45218: variable 'interface1' from source: play vars 41016 1727204210.45258: variable 'interface1' from source: play vars 41016 1727204210.45313: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204210.45424: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204210.45453: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204210.45477: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204210.45499: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204210.45544: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204210.45560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204210.45583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.45601: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204210.45639: variable '__network_team_connections_defined' from source: role '' defaults 41016 1727204210.45790: variable 'network_connections' from source: include params 41016 1727204210.45794: variable 'interface0' from source: play vars 41016 1727204210.45837: variable 'interface0' from source: play vars 41016 1727204210.45842: variable 'interface1' from source: play vars 41016 1727204210.45888: variable 'interface1' from source: play vars 41016 1727204210.45905: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41016 1727204210.45908: when evaluation is False, skipping this task 41016 1727204210.45914: _execute() done 41016 1727204210.45916: dumping result to json 41016 1727204210.45918: done dumping result, returning 41016 1727204210.45925: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-12d5-0ec4-000000000646] 41016 1727204210.45929: sending task result for task 028d2410-947f-12d5-0ec4-000000000646 41016 1727204210.46017: done sending task result for task 028d2410-947f-12d5-0ec4-000000000646 41016 1727204210.46020: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41016 1727204210.46070: no more pending results, returning what we have 41016 1727204210.46074: results queue empty 41016 1727204210.46077: checking for any_errors_fatal 41016 1727204210.46084: done checking for any_errors_fatal 41016 1727204210.46085: checking for max_fail_percentage 41016 1727204210.46087: done checking for max_fail_percentage 41016 1727204210.46088: checking to see if all hosts have failed and the running result is not ok 41016 1727204210.46089: done checking to see if all hosts have failed 41016 1727204210.46089: getting the remaining hosts for this loop 
41016 1727204210.46091: done getting the remaining hosts for this loop 41016 1727204210.46094: getting the next task for host managed-node1 41016 1727204210.46102: done getting next task for host managed-node1 41016 1727204210.46106: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41016 1727204210.46112: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204210.46132: getting variables 41016 1727204210.46134: in VariableManager get_vars() 41016 1727204210.46183: Calling all_inventory to load vars for managed-node1 41016 1727204210.46186: Calling groups_inventory to load vars for managed-node1 41016 1727204210.46189: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204210.46198: Calling all_plugins_play to load vars for managed-node1 41016 1727204210.46201: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204210.46203: Calling groups_plugins_play to load vars for managed-node1 41016 1727204210.47525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204210.48670: done with get_vars() 41016 1727204210.48700: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 41016 1727204210.48787: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:56:50 -0400 (0:00:00.078) 0:00:34.164 ***** 41016 1727204210.48820: entering _queue_task() for managed-node1/yum 41016 1727204210.49231: worker is 1 (out of 1 available) 41016 1727204210.49244: exiting _queue_task() for managed-node1/yum 41016 1727204210.49256: done queuing things up, now waiting for results queue to drain 41016 1727204210.49258: waiting for pending results... 
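NOTE: the preceding DNF check (roles/network/tasks/main.yml:36) is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined holds for this connection set; both flags come from the role's defaults, as the trace records. A sketch of the shape of that task, with the when: taken from the logged false_condition; the name and state arguments are assumptions, not shown in the trace:

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"  # assumed; network_packages is the role default resolved later in this trace
    state: latest                   # assumed
  when: __network_wireless_connections_defined or __network_team_connections_defined

Since interface0 and interface1 define neither wireless nor team connections, the dnf action is never invoked on managed-node1.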
41016 1727204210.49511: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 41016 1727204210.49602: in run() - task 028d2410-947f-12d5-0ec4-000000000647 41016 1727204210.49616: variable 'ansible_search_path' from source: unknown 41016 1727204210.49620: variable 'ansible_search_path' from source: unknown 41016 1727204210.49655: calling self._execute() 41016 1727204210.49730: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204210.49734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204210.49744: variable 'omit' from source: magic vars 41016 1727204210.50030: variable 'ansible_distribution_major_version' from source: facts 41016 1727204210.50039: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204210.50185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204210.52011: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204210.52330: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204210.52334: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204210.52363: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204210.52393: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204210.52470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204210.52567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204210.52571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.52574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204210.52578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204210.52672: variable 'ansible_distribution_major_version' from source: facts 41016 1727204210.52688: Evaluated conditional (ansible_distribution_major_version | int < 8): False 41016 1727204210.52691: when evaluation is False, skipping this task 41016 1727204210.52694: _execute() done 41016 1727204210.52697: dumping result to json 41016 1727204210.52764: done dumping result, returning 41016 1727204210.52767: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-12d5-0ec4-000000000647] 41016 
1727204210.52769: sending task result for task 028d2410-947f-12d5-0ec4-000000000647 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 41016 1727204210.52920: no more pending results, returning what we have 41016 1727204210.52924: results queue empty 41016 1727204210.52925: checking for any_errors_fatal 41016 1727204210.52930: done checking for any_errors_fatal 41016 1727204210.52930: checking for max_fail_percentage 41016 1727204210.52932: done checking for max_fail_percentage 41016 1727204210.52933: checking to see if all hosts have failed and the running result is not ok 41016 1727204210.52934: done checking to see if all hosts have failed 41016 1727204210.52934: getting the remaining hosts for this loop 41016 1727204210.52936: done getting the remaining hosts for this loop 41016 1727204210.52939: getting the next task for host managed-node1 41016 1727204210.52946: done getting next task for host managed-node1 41016 1727204210.52950: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41016 1727204210.52953: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204210.52973: getting variables 41016 1727204210.52977: in VariableManager get_vars() 41016 1727204210.53015: Calling all_inventory to load vars for managed-node1 41016 1727204210.53017: Calling groups_inventory to load vars for managed-node1 41016 1727204210.53020: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204210.53028: Calling all_plugins_play to load vars for managed-node1 41016 1727204210.53031: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204210.53033: Calling groups_plugins_play to load vars for managed-node1 41016 1727204210.53682: done sending task result for task 028d2410-947f-12d5-0ec4-000000000647 41016 1727204210.53685: WORKER PROCESS EXITING 41016 1727204210.59260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204210.60794: done with get_vars() 41016 1727204210.60827: done getting variables 41016 1727204210.60880: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:56:50 -0400 (0:00:00.120) 0:00:34.285 ***** 41016 1727204210.60912: entering _queue_task() for managed-node1/fail 41016 1727204210.61279: worker is 1 (out of 1 available) 41016 1727204210.61292: exiting _queue_task() for managed-node1/fail 41016 1727204210.61305: done queuing things up, now waiting for results queue to drain 41016 1727204210.61307: waiting for pending results... 
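NOTE: the YUM variant of the same check (roles/network/tasks/main.yml:48) is guarded by ansible_distribution_major_version | int < 8, which is False because the managed node reports a major version of 8 or newer, so it is skipped; the trace also records ansible.builtin.yum being redirected to ansible.builtin.dnf on this platform. A sketch under the same assumptions as the DNF variant above (only the condition is taken from the log):

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:              # redirected to ansible.builtin.dnf at runtime, as logged
    name: "{{ network_packages }}"  # assumed
    state: latest                   # assumed
  when: ansible_distribution_major_version | int < 8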
41016 1727204210.61698: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 41016 1727204210.61777: in run() - task 028d2410-947f-12d5-0ec4-000000000648 41016 1727204210.61801: variable 'ansible_search_path' from source: unknown 41016 1727204210.61815: variable 'ansible_search_path' from source: unknown 41016 1727204210.61857: calling self._execute() 41016 1727204210.61962: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204210.61978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204210.61994: variable 'omit' from source: magic vars 41016 1727204210.62407: variable 'ansible_distribution_major_version' from source: facts 41016 1727204210.62427: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204210.62561: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204210.62768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204210.65078: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204210.65165: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204210.65268: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204210.65271: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204210.65277: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204210.65358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204210.65399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204210.65434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.65518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204210.65927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204210.65931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204210.65933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204210.65936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.65938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204210.65940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204210.65991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204210.66019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204210.66050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.66097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204210.66116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204210.66301: variable 'network_connections' from source: include params 41016 1727204210.66382: variable 'interface0' from source: play vars 41016 1727204210.66404: variable 'interface0' from source: play vars 41016 1727204210.66419: variable 'interface1' from source: play vars 41016 1727204210.66483: variable 'interface1' from source: play vars 41016 1727204210.66560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204210.66751: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204210.66794: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204210.66832: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204210.66867: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204210.66915: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204210.67035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204210.67038: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.67041: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 
1727204210.67068: variable '__network_team_connections_defined' from source: role '' defaults 41016 1727204210.67343: variable 'network_connections' from source: include params 41016 1727204210.67578: variable 'interface0' from source: play vars 41016 1727204210.67583: variable 'interface0' from source: play vars 41016 1727204210.67585: variable 'interface1' from source: play vars 41016 1727204210.67587: variable 'interface1' from source: play vars 41016 1727204210.67588: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41016 1727204210.67590: when evaluation is False, skipping this task 41016 1727204210.67592: _execute() done 41016 1727204210.67594: dumping result to json 41016 1727204210.67596: done dumping result, returning 41016 1727204210.67598: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-12d5-0ec4-000000000648] 41016 1727204210.67607: sending task result for task 028d2410-947f-12d5-0ec4-000000000648 41016 1727204210.67698: done sending task result for task 028d2410-947f-12d5-0ec4-000000000648 41016 1727204210.67702: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41016 1727204210.67758: no more pending results, returning what we have 41016 1727204210.67762: results queue empty 41016 1727204210.67764: checking for any_errors_fatal 41016 1727204210.67773: done checking for any_errors_fatal 41016 1727204210.67774: checking for max_fail_percentage 41016 1727204210.67778: done checking for max_fail_percentage 41016 1727204210.67779: checking to see if all hosts have failed and the running result is not ok 41016 1727204210.67780: done checking to see if all hosts have failed 41016 1727204210.67781: getting the remaining hosts for this loop 41016 1727204210.67783: done getting the remaining hosts for this loop 41016 1727204210.67787: getting the next task for host managed-node1 41016 1727204210.67795: done getting next task for host managed-node1 41016 1727204210.67800: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 41016 1727204210.67804: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204210.67828: getting variables 41016 1727204210.67830: in VariableManager get_vars() 41016 1727204210.67893: Calling all_inventory to load vars for managed-node1 41016 1727204210.67896: Calling groups_inventory to load vars for managed-node1 41016 1727204210.67899: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204210.67941: Calling all_plugins_play to load vars for managed-node1 41016 1727204210.67945: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204210.67949: Calling groups_plugins_play to load vars for managed-node1 41016 1727204210.69611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204210.71279: done with get_vars() 41016 1727204210.71305: done getting variables 41016 1727204210.71357: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:56:50 -0400 (0:00:00.104) 0:00:34.390 ***** 41016 1727204210.71392: entering _queue_task() for managed-node1/package 41016 1727204210.71720: worker is 1 (out of 1 available) 41016 1727204210.71734: exiting _queue_task() for managed-node1/package 41016 1727204210.71746: done queuing things up, now waiting for results queue to drain 41016 1727204210.71747: waiting for pending results... 41016 1727204210.72198: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 41016 1727204210.72202: in run() - task 028d2410-947f-12d5-0ec4-000000000649 41016 1727204210.72205: variable 'ansible_search_path' from source: unknown 41016 1727204210.72208: variable 'ansible_search_path' from source: unknown 41016 1727204210.72248: calling self._execute() 41016 1727204210.72355: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204210.72366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204210.72381: variable 'omit' from source: magic vars 41016 1727204210.72765: variable 'ansible_distribution_major_version' from source: facts 41016 1727204210.72784: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204210.72986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204210.73272: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204210.73382: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204210.73385: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204210.73438: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204210.73559: variable 'network_packages' from source: role '' defaults 41016 1727204210.73673: variable '__network_provider_setup' from source: role '' defaults 41016 1727204210.73690: variable '__network_service_name_default_nm' from source: role '' defaults 41016 1727204210.73765: variable 
'__network_service_name_default_nm' from source: role '' defaults 41016 1727204210.73781: variable '__network_packages_default_nm' from source: role '' defaults 41016 1727204210.73848: variable '__network_packages_default_nm' from source: role '' defaults 41016 1727204210.74042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204210.76330: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204210.76401: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204210.76580: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204210.76584: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204210.76586: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204210.76588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204210.76620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204210.76648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.76694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204210.76718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204210.76765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204210.76796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204210.76830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.76872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204210.76892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204210.77144: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41016 1727204210.77258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204210.77287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204210.77314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.77361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204210.77382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204210.77464: variable 'ansible_python' from source: facts 41016 1727204210.77491: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41016 1727204210.77780: variable '__network_wpa_supplicant_required' from source: role '' defaults 41016 1727204210.77784: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41016 1727204210.77787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204210.77808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204210.77838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.77887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204210.77913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204210.77962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204210.78001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204210.78035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.78074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204210.78094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204210.78218: variable 'network_connections' from source: include params 41016 1727204210.78232: variable 'interface0' from source: play vars 41016 1727204210.78332: variable 'interface0' from source: play vars 41016 1727204210.78352: variable 'interface1' from source: play vars 41016 1727204210.78559: variable 'interface1' from source: play vars 41016 1727204210.78562: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204210.78564: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204210.78703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204210.78883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204210.78887: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204210.79547: variable 'network_connections' from source: include params 41016 1727204210.79558: variable 'interface0' from source: play vars 41016 1727204210.79749: variable 'interface0' from source: play vars 41016 1727204210.79976: variable 'interface1' from source: play vars 41016 1727204210.80192: variable 'interface1' from source: play vars 41016 1727204210.80195: variable '__network_packages_default_wireless' from source: role '' defaults 41016 1727204210.80197: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204210.80874: variable 'network_connections' from source: include params 41016 1727204210.80889: variable 'interface0' from source: play vars 41016 1727204210.81282: variable 'interface0' from source: play vars 41016 1727204210.81285: variable 'interface1' from source: play vars 41016 1727204210.81287: variable 'interface1' from source: play vars 41016 1727204210.81289: variable '__network_packages_default_team' from source: role '' defaults 41016 1727204210.81365: variable '__network_team_connections_defined' from source: role '' defaults 41016 1727204210.82005: variable 'network_connections' from source: include params 41016 1727204210.82015: variable 'interface0' from source: play vars 41016 1727204210.82187: variable 'interface0' from source: play vars 41016 1727204210.82199: variable 'interface1' from source: play vars 41016 1727204210.82266: variable 'interface1' from source: play vars 41016 1727204210.82367: variable '__network_service_name_default_initscripts' from source: role '' defaults 41016 1727204210.82543: variable '__network_service_name_default_initscripts' from source: role '' defaults 41016 1727204210.82590: variable '__network_packages_default_initscripts' from source: role '' defaults 41016 1727204210.82696: variable '__network_packages_default_initscripts' from source: role '' defaults 41016 1727204210.83188: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41016 1727204210.84027: variable 'network_connections' from source: include params 41016 
1727204210.84187: variable 'interface0' from source: play vars 41016 1727204210.84239: variable 'interface0' from source: play vars 41016 1727204210.84253: variable 'interface1' from source: play vars 41016 1727204210.84432: variable 'interface1' from source: play vars 41016 1727204210.84445: variable 'ansible_distribution' from source: facts 41016 1727204210.84453: variable '__network_rh_distros' from source: role '' defaults 41016 1727204210.84464: variable 'ansible_distribution_major_version' from source: facts 41016 1727204210.84488: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41016 1727204210.84870: variable 'ansible_distribution' from source: facts 41016 1727204210.84884: variable '__network_rh_distros' from source: role '' defaults 41016 1727204210.84894: variable 'ansible_distribution_major_version' from source: facts 41016 1727204210.84917: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41016 1727204210.85352: variable 'ansible_distribution' from source: facts 41016 1727204210.85355: variable '__network_rh_distros' from source: role '' defaults 41016 1727204210.85357: variable 'ansible_distribution_major_version' from source: facts 41016 1727204210.85359: variable 'network_provider' from source: set_fact 41016 1727204210.85378: variable 'ansible_facts' from source: unknown 41016 1727204210.86757: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 41016 1727204210.86933: when evaluation is False, skipping this task 41016 1727204210.86936: _execute() done 41016 1727204210.86938: dumping result to json 41016 1727204210.86940: done dumping result, returning 41016 1727204210.86943: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-12d5-0ec4-000000000649] 41016 1727204210.86945: sending task result for task 028d2410-947f-12d5-0ec4-000000000649 41016 1727204210.87023: done sending task result for task 028d2410-947f-12d5-0ec4-000000000649 41016 1727204210.87027: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 41016 1727204210.87083: no more pending results, returning what we have 41016 1727204210.87088: results queue empty 41016 1727204210.87089: checking for any_errors_fatal 41016 1727204210.87096: done checking for any_errors_fatal 41016 1727204210.87096: checking for max_fail_percentage 41016 1727204210.87098: done checking for max_fail_percentage 41016 1727204210.87099: checking to see if all hosts have failed and the running result is not ok 41016 1727204210.87100: done checking to see if all hosts have failed 41016 1727204210.87101: getting the remaining hosts for this loop 41016 1727204210.87103: done getting the remaining hosts for this loop 41016 1727204210.87107: getting the next task for host managed-node1 41016 1727204210.87114: done getting next task for host managed-node1 41016 1727204210.87119: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41016 1727204210.87122: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204210.87149: getting variables 41016 1727204210.87151: in VariableManager get_vars() 41016 1727204210.87196: Calling all_inventory to load vars for managed-node1 41016 1727204210.87200: Calling groups_inventory to load vars for managed-node1 41016 1727204210.87203: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204210.87213: Calling all_plugins_play to load vars for managed-node1 41016 1727204210.87217: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204210.87220: Calling groups_plugins_play to load vars for managed-node1 41016 1727204210.89343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204210.90867: done with get_vars() 41016 1727204210.90893: done getting variables 41016 1727204210.90954: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:56:50 -0400 (0:00:00.195) 0:00:34.585 ***** 41016 1727204210.90988: entering _queue_task() for managed-node1/package 41016 1727204210.91333: worker is 1 (out of 1 available) 41016 1727204210.91347: exiting _queue_task() for managed-node1/package 41016 1727204210.91359: done queuing things up, now waiting for results queue to drain 41016 1727204210.91361: waiting for pending results... 
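NOTE: the "Install packages" task above (roles/network/tasks/main.yml:73) uses the generic package action and is skipped because every entry in network_packages is already present in the gathered package facts, so the logged condition not network_packages is subset(ansible_facts.packages.keys()) is False. A sketch of that task with the when: taken verbatim from the trace; only the name and state arguments are assumed:

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"  # assumed; the trace shows network_packages being resolved from the role defaults
    state: present                  # assumed
  when: not network_packages is subset(ansible_facts.packages.keys())

The subset test is what lets the role skip the package manager entirely when nothing is missing; the long walk through the __network_packages_default_* and __network_service_name_default_* variables just before the evaluation is how network_packages is assembled for the provider selected by the network_provider set_fact.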
41016 1727204210.91652: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 41016 1727204210.91982: in run() - task 028d2410-947f-12d5-0ec4-00000000064a 41016 1727204210.91986: variable 'ansible_search_path' from source: unknown 41016 1727204210.91988: variable 'ansible_search_path' from source: unknown 41016 1727204210.91990: calling self._execute() 41016 1727204210.91993: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204210.91996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204210.91998: variable 'omit' from source: magic vars 41016 1727204210.92379: variable 'ansible_distribution_major_version' from source: facts 41016 1727204210.92395: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204210.92520: variable 'network_state' from source: role '' defaults 41016 1727204210.92535: Evaluated conditional (network_state != {}): False 41016 1727204210.92545: when evaluation is False, skipping this task 41016 1727204210.92553: _execute() done 41016 1727204210.92560: dumping result to json 41016 1727204210.92568: done dumping result, returning 41016 1727204210.92581: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-12d5-0ec4-00000000064a] 41016 1727204210.92592: sending task result for task 028d2410-947f-12d5-0ec4-00000000064a skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41016 1727204210.92742: no more pending results, returning what we have 41016 1727204210.92746: results queue empty 41016 1727204210.92747: checking for any_errors_fatal 41016 1727204210.92755: done checking for any_errors_fatal 41016 1727204210.92756: checking for max_fail_percentage 41016 1727204210.92758: done checking for max_fail_percentage 41016 1727204210.92759: checking to see if all hosts have failed and the running result is not ok 41016 1727204210.92760: done checking to see if all hosts have failed 41016 1727204210.92760: getting the remaining hosts for this loop 41016 1727204210.92762: done getting the remaining hosts for this loop 41016 1727204210.92766: getting the next task for host managed-node1 41016 1727204210.92774: done getting next task for host managed-node1 41016 1727204210.92779: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41016 1727204210.92784: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204210.92808: getting variables 41016 1727204210.92810: in VariableManager get_vars() 41016 1727204210.93025: Calling all_inventory to load vars for managed-node1 41016 1727204210.93029: Calling groups_inventory to load vars for managed-node1 41016 1727204210.93032: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204210.93042: Calling all_plugins_play to load vars for managed-node1 41016 1727204210.93046: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204210.93049: Calling groups_plugins_play to load vars for managed-node1 41016 1727204210.93594: done sending task result for task 028d2410-947f-12d5-0ec4-00000000064a 41016 1727204210.93598: WORKER PROCESS EXITING 41016 1727204210.94525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204210.96241: done with get_vars() 41016 1727204210.96264: done getting variables 41016 1727204210.96328: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:56:50 -0400 (0:00:00.053) 0:00:34.639 ***** 41016 1727204210.96362: entering _queue_task() for managed-node1/package 41016 1727204210.96716: worker is 1 (out of 1 available) 41016 1727204210.96729: exiting _queue_task() for managed-node1/package 41016 1727204210.96742: done queuing things up, now waiting for results queue to drain 41016 1727204210.96743: waiting for pending results... 
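The same pattern repeats below for the python3-libnmstate task at main.yml:96: the 'package' action plugin is loaded, the distribution check passes, and network_state != {} is again False, so the task is skipped. A hedged sketch of how a caller could make these tasks run, assuming only that the role accepts a non-empty network_state dictionary (the variable name comes from the log; the interface content is purely illustrative):

# Illustrative play only; the interface name and state layout are assumptions
- hosts: managed-node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:            # any non-empty dict makes network_state != {} evaluate True
          interfaces:
            - name: eth1          # hypothetical interface
              type: ethernet
              state: up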
41016 1727204210.97041: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 41016 1727204210.97180: in run() - task 028d2410-947f-12d5-0ec4-00000000064b 41016 1727204210.97204: variable 'ansible_search_path' from source: unknown 41016 1727204210.97213: variable 'ansible_search_path' from source: unknown 41016 1727204210.97252: calling self._execute() 41016 1727204210.97358: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204210.97370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204210.97385: variable 'omit' from source: magic vars 41016 1727204210.97780: variable 'ansible_distribution_major_version' from source: facts 41016 1727204210.97797: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204210.97926: variable 'network_state' from source: role '' defaults 41016 1727204210.97942: Evaluated conditional (network_state != {}): False 41016 1727204210.97951: when evaluation is False, skipping this task 41016 1727204210.97962: _execute() done 41016 1727204210.97970: dumping result to json 41016 1727204210.97980: done dumping result, returning 41016 1727204210.97991: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-12d5-0ec4-00000000064b] 41016 1727204210.98002: sending task result for task 028d2410-947f-12d5-0ec4-00000000064b skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41016 1727204210.98153: no more pending results, returning what we have 41016 1727204210.98158: results queue empty 41016 1727204210.98159: checking for any_errors_fatal 41016 1727204210.98165: done checking for any_errors_fatal 41016 1727204210.98166: checking for max_fail_percentage 41016 1727204210.98168: done checking for max_fail_percentage 41016 1727204210.98169: checking to see if all hosts have failed and the running result is not ok 41016 1727204210.98170: done checking to see if all hosts have failed 41016 1727204210.98171: getting the remaining hosts for this loop 41016 1727204210.98172: done getting the remaining hosts for this loop 41016 1727204210.98177: getting the next task for host managed-node1 41016 1727204210.98186: done getting next task for host managed-node1 41016 1727204210.98191: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41016 1727204210.98196: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204210.98220: getting variables 41016 1727204210.98222: in VariableManager get_vars() 41016 1727204210.98268: Calling all_inventory to load vars for managed-node1 41016 1727204210.98271: Calling groups_inventory to load vars for managed-node1 41016 1727204210.98274: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204210.98590: Calling all_plugins_play to load vars for managed-node1 41016 1727204210.98594: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204210.98597: Calling groups_plugins_play to load vars for managed-node1 41016 1727204210.99288: done sending task result for task 028d2410-947f-12d5-0ec4-00000000064b 41016 1727204210.99292: WORKER PROCESS EXITING 41016 1727204210.99904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204211.01472: done with get_vars() 41016 1727204211.01501: done getting variables 41016 1727204211.01563: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:56:51 -0400 (0:00:00.052) 0:00:34.692 ***** 41016 1727204211.01599: entering _queue_task() for managed-node1/service 41016 1727204211.01940: worker is 1 (out of 1 available) 41016 1727204211.01950: exiting _queue_task() for managed-node1/service 41016 1727204211.01963: done queuing things up, now waiting for results queue to drain 41016 1727204211.01964: waiting for pending results... 
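The lines that follow queue the 'Restart NetworkManager due to wireless or team interfaces' task from main.yml:109 through the 'service' action plugin. Its guard is derived from the configured network_connections (interface0 and interface1 here): the role's internal flags __network_wireless_connections_defined and __network_team_connections_defined both evaluate False, so the restart is skipped. A rough sketch of the shape of such a guarded restart; the service name and option choices are assumptions, while the when expression matches the false_condition reported in the result:

# Hypothetical sketch; the real role computes these flags in its own defaults and filters
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager        # assumed unit name
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined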
41016 1727204211.02258: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 41016 1727204211.02416: in run() - task 028d2410-947f-12d5-0ec4-00000000064c 41016 1727204211.02437: variable 'ansible_search_path' from source: unknown 41016 1727204211.02446: variable 'ansible_search_path' from source: unknown 41016 1727204211.02487: calling self._execute() 41016 1727204211.02588: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204211.02599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204211.02617: variable 'omit' from source: magic vars 41016 1727204211.03002: variable 'ansible_distribution_major_version' from source: facts 41016 1727204211.03018: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204211.03145: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204211.03346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204211.05527: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204211.05607: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204211.05652: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204211.05695: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204211.05727: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204211.05809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204211.05845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204211.05882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204211.05926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204211.05944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204211.06078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204211.06081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204211.06084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 41016 1727204211.06091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204211.06110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204211.06152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204211.06182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204211.06210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204211.06250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204211.06269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204211.06450: variable 'network_connections' from source: include params 41016 1727204211.06468: variable 'interface0' from source: play vars 41016 1727204211.06552: variable 'interface0' from source: play vars 41016 1727204211.06568: variable 'interface1' from source: play vars 41016 1727204211.06642: variable 'interface1' from source: play vars 41016 1727204211.06720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204211.07260: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204211.07304: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204211.07342: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204211.07372: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204211.07420: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204211.07452: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204211.07486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204211.07581: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204211.07653: variable '__network_team_connections_defined' from source: role '' 
defaults 41016 1727204211.08304: variable 'network_connections' from source: include params 41016 1727204211.08307: variable 'interface0' from source: play vars 41016 1727204211.08310: variable 'interface0' from source: play vars 41016 1727204211.08312: variable 'interface1' from source: play vars 41016 1727204211.08435: variable 'interface1' from source: play vars 41016 1727204211.08580: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 41016 1727204211.08584: when evaluation is False, skipping this task 41016 1727204211.08586: _execute() done 41016 1727204211.08588: dumping result to json 41016 1727204211.08591: done dumping result, returning 41016 1727204211.08593: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-12d5-0ec4-00000000064c] 41016 1727204211.08602: sending task result for task 028d2410-947f-12d5-0ec4-00000000064c skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 41016 1727204211.08781: no more pending results, returning what we have 41016 1727204211.08786: results queue empty 41016 1727204211.08787: checking for any_errors_fatal 41016 1727204211.08794: done checking for any_errors_fatal 41016 1727204211.08795: checking for max_fail_percentage 41016 1727204211.08796: done checking for max_fail_percentage 41016 1727204211.08797: checking to see if all hosts have failed and the running result is not ok 41016 1727204211.08798: done checking to see if all hosts have failed 41016 1727204211.08799: getting the remaining hosts for this loop 41016 1727204211.08800: done getting the remaining hosts for this loop 41016 1727204211.08805: getting the next task for host managed-node1 41016 1727204211.08813: done getting next task for host managed-node1 41016 1727204211.08818: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41016 1727204211.08822: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204211.08845: getting variables 41016 1727204211.08847: in VariableManager get_vars() 41016 1727204211.08893: Calling all_inventory to load vars for managed-node1 41016 1727204211.08896: Calling groups_inventory to load vars for managed-node1 41016 1727204211.08899: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204211.08908: Calling all_plugins_play to load vars for managed-node1 41016 1727204211.08911: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204211.08915: Calling groups_plugins_play to load vars for managed-node1 41016 1727204211.09731: done sending task result for task 028d2410-947f-12d5-0ec4-00000000064c 41016 1727204211.09735: WORKER PROCESS EXITING 41016 1727204211.10952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204211.13553: done with get_vars() 41016 1727204211.13583: done getting variables 41016 1727204211.13646: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:56:51 -0400 (0:00:00.120) 0:00:34.812 ***** 41016 1727204211.13678: entering _queue_task() for managed-node1/service 41016 1727204211.14058: worker is 1 (out of 1 available) 41016 1727204211.14070: exiting _queue_task() for managed-node1/service 41016 1727204211.14086: done queuing things up, now waiting for results queue to drain 41016 1727204211.14088: waiting for pending results... 
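Unlike the three skipped tasks above, the 'Enable and start NetworkManager' task at main.yml:122 passes its conditional (network_provider == "nm" or network_state != {} evaluates True) and actually executes: the 'service' action resolves to the systemd module, an SSH connection is reused via the control master, a remote temp directory is created, AnsiballZ_systemd.py is transferred over sftp, and the module returns the NetworkManager.service unit state (enabled, started) shown in the JSON result further down. A minimal sketch of the task as the log implies it, assuming only that network_service_name resolves to the NetworkManager unit, which the module_args in the result support:

# Hedged reconstruction of roles/network/tasks/main.yml:122; variable wiring is inferred from the log
- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"   # resolves to NetworkManager per the module_args in the result
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}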
41016 1727204211.14364: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 41016 1727204211.14498: in run() - task 028d2410-947f-12d5-0ec4-00000000064d 41016 1727204211.14519: variable 'ansible_search_path' from source: unknown 41016 1727204211.14529: variable 'ansible_search_path' from source: unknown 41016 1727204211.14571: calling self._execute() 41016 1727204211.14670: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204211.14684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204211.14700: variable 'omit' from source: magic vars 41016 1727204211.15077: variable 'ansible_distribution_major_version' from source: facts 41016 1727204211.15095: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204211.15269: variable 'network_provider' from source: set_fact 41016 1727204211.15285: variable 'network_state' from source: role '' defaults 41016 1727204211.15311: Evaluated conditional (network_provider == "nm" or network_state != {}): True 41016 1727204211.15324: variable 'omit' from source: magic vars 41016 1727204211.15382: variable 'omit' from source: magic vars 41016 1727204211.15681: variable 'network_service_name' from source: role '' defaults 41016 1727204211.15684: variable 'network_service_name' from source: role '' defaults 41016 1727204211.15981: variable '__network_provider_setup' from source: role '' defaults 41016 1727204211.15984: variable '__network_service_name_default_nm' from source: role '' defaults 41016 1727204211.15987: variable '__network_service_name_default_nm' from source: role '' defaults 41016 1727204211.15989: variable '__network_packages_default_nm' from source: role '' defaults 41016 1727204211.16099: variable '__network_packages_default_nm' from source: role '' defaults 41016 1727204211.16467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204211.18642: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204211.18725: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204211.18765: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204211.18806: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204211.18836: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204211.18919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204211.18953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204211.18984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204211.19028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 41016 1727204211.19046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204211.19094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204211.19122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204211.19149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204211.19192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204211.19209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204211.19443: variable '__network_packages_default_gobject_packages' from source: role '' defaults 41016 1727204211.20080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204211.20084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204211.20086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204211.20089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204211.20092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204211.20094: variable 'ansible_python' from source: facts 41016 1727204211.20303: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 41016 1727204211.20390: variable '__network_wpa_supplicant_required' from source: role '' defaults 41016 1727204211.20636: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41016 1727204211.20761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204211.20910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204211.20942: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204211.20986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204211.21065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204211.21125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204211.21160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204211.21212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204211.21281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204211.21334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204211.22082: variable 'network_connections' from source: include params 41016 1727204211.22085: variable 'interface0' from source: play vars 41016 1727204211.22088: variable 'interface0' from source: play vars 41016 1727204211.22091: variable 'interface1' from source: play vars 41016 1727204211.22093: variable 'interface1' from source: play vars 41016 1727204211.22397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204211.22598: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204211.22848: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204211.22980: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204211.23105: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204211.23170: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204211.23214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204211.23255: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204211.23295: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204211.23346: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204211.23629: variable 'network_connections' from source: include params 41016 1727204211.23641: variable 'interface0' from source: play vars 41016 1727204211.23718: variable 'interface0' from source: play vars 41016 1727204211.23734: variable 'interface1' from source: play vars 41016 1727204211.23807: variable 'interface1' from source: play vars 41016 1727204211.23845: variable '__network_packages_default_wireless' from source: role '' defaults 41016 1727204211.23928: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204211.24205: variable 'network_connections' from source: include params 41016 1727204211.24215: variable 'interface0' from source: play vars 41016 1727204211.24278: variable 'interface0' from source: play vars 41016 1727204211.24290: variable 'interface1' from source: play vars 41016 1727204211.24350: variable 'interface1' from source: play vars 41016 1727204211.24374: variable '__network_packages_default_team' from source: role '' defaults 41016 1727204211.24446: variable '__network_team_connections_defined' from source: role '' defaults 41016 1727204211.24735: variable 'network_connections' from source: include params 41016 1727204211.24745: variable 'interface0' from source: play vars 41016 1727204211.24936: variable 'interface0' from source: play vars 41016 1727204211.25039: variable 'interface1' from source: play vars 41016 1727204211.25495: variable 'interface1' from source: play vars 41016 1727204211.25499: variable '__network_service_name_default_initscripts' from source: role '' defaults 41016 1727204211.25501: variable '__network_service_name_default_initscripts' from source: role '' defaults 41016 1727204211.25505: variable '__network_packages_default_initscripts' from source: role '' defaults 41016 1727204211.25591: variable '__network_packages_default_initscripts' from source: role '' defaults 41016 1727204211.26007: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 41016 1727204211.26806: variable 'network_connections' from source: include params 41016 1727204211.26818: variable 'interface0' from source: play vars 41016 1727204211.26900: variable 'interface0' from source: play vars 41016 1727204211.26913: variable 'interface1' from source: play vars 41016 1727204211.26977: variable 'interface1' from source: play vars 41016 1727204211.26991: variable 'ansible_distribution' from source: facts 41016 1727204211.27001: variable '__network_rh_distros' from source: role '' defaults 41016 1727204211.27012: variable 'ansible_distribution_major_version' from source: facts 41016 1727204211.27036: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 41016 1727204211.27208: variable 'ansible_distribution' from source: facts 41016 1727204211.27218: variable '__network_rh_distros' from source: role '' defaults 41016 1727204211.27229: variable 'ansible_distribution_major_version' from source: facts 41016 1727204211.27246: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 41016 1727204211.27423: variable 'ansible_distribution' from source: facts 41016 1727204211.27432: variable '__network_rh_distros' from source: role '' defaults 41016 1727204211.27443: variable 'ansible_distribution_major_version' 
from source: facts 41016 1727204211.27484: variable 'network_provider' from source: set_fact 41016 1727204211.27514: variable 'omit' from source: magic vars 41016 1727204211.27549: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204211.27586: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204211.27611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204211.27634: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204211.27651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204211.27688: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204211.27699: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204211.27709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204211.27815: Set connection var ansible_shell_executable to /bin/sh 41016 1727204211.27827: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204211.27838: Set connection var ansible_shell_type to sh 41016 1727204211.27850: Set connection var ansible_timeout to 10 41016 1727204211.27859: Set connection var ansible_pipelining to False 41016 1727204211.27870: Set connection var ansible_connection to ssh 41016 1727204211.27901: variable 'ansible_shell_executable' from source: unknown 41016 1727204211.27981: variable 'ansible_connection' from source: unknown 41016 1727204211.27985: variable 'ansible_module_compression' from source: unknown 41016 1727204211.27988: variable 'ansible_shell_type' from source: unknown 41016 1727204211.27990: variable 'ansible_shell_executable' from source: unknown 41016 1727204211.27997: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204211.27999: variable 'ansible_pipelining' from source: unknown 41016 1727204211.28001: variable 'ansible_timeout' from source: unknown 41016 1727204211.28003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204211.28066: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204211.28085: variable 'omit' from source: magic vars 41016 1727204211.28096: starting attempt loop 41016 1727204211.28104: running the handler 41016 1727204211.28188: variable 'ansible_facts' from source: unknown 41016 1727204211.28933: _low_level_execute_command(): starting 41016 1727204211.28948: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204211.29711: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204211.29728: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204211.29744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204211.29763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204211.29782: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 <<< 41016 1727204211.29796: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204211.29967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204211.30095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204211.30216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204211.31993: stdout chunk (state=3): >>>/root <<< 41016 1727204211.32163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204211.32178: stdout chunk (state=3): >>><<< 41016 1727204211.32193: stderr chunk (state=3): >>><<< 41016 1727204211.32219: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204211.32471: _low_level_execute_command(): starting 41016 1727204211.32475: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204211.3238392-43245-128584715063727 `" && echo ansible-tmp-1727204211.3238392-43245-128584715063727="` echo /root/.ansible/tmp/ansible-tmp-1727204211.3238392-43245-128584715063727 `" ) && sleep 0' 41016 1727204211.33166: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204211.33181: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 41016 1727204211.33236: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204211.33280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204211.33292: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204211.33300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204211.33414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204211.35520: stdout chunk (state=3): >>>ansible-tmp-1727204211.3238392-43245-128584715063727=/root/.ansible/tmp/ansible-tmp-1727204211.3238392-43245-128584715063727 <<< 41016 1727204211.35668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204211.35671: stdout chunk (state=3): >>><<< 41016 1727204211.35674: stderr chunk (state=3): >>><<< 41016 1727204211.35697: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204211.3238392-43245-128584715063727=/root/.ansible/tmp/ansible-tmp-1727204211.3238392-43245-128584715063727 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204211.35744: variable 'ansible_module_compression' from source: unknown 41016 1727204211.35881: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 41016 1727204211.35888: variable 'ansible_facts' from source: unknown 41016 1727204211.36147: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204211.3238392-43245-128584715063727/AnsiballZ_systemd.py 41016 1727204211.36360: Sending initial data 41016 1727204211.36379: Sent initial data (156 bytes) 41016 1727204211.36849: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204211.36852: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204211.36855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204211.36857: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204211.36859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 41016 1727204211.36861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204211.36906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204211.36919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204211.37009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204211.38797: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204211.38885: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204211.38968: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmprnmsx705 /root/.ansible/tmp/ansible-tmp-1727204211.3238392-43245-128584715063727/AnsiballZ_systemd.py <<< 41016 1727204211.38972: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204211.3238392-43245-128584715063727/AnsiballZ_systemd.py" <<< 41016 1727204211.39058: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmprnmsx705" to remote "/root/.ansible/tmp/ansible-tmp-1727204211.3238392-43245-128584715063727/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204211.3238392-43245-128584715063727/AnsiballZ_systemd.py" <<< 41016 1727204211.40400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204211.40471: stderr chunk (state=3): >>><<< 41016 1727204211.40474: stdout chunk (state=3): >>><<< 41016 1727204211.40478: done transferring module to remote 41016 1727204211.40481: _low_level_execute_command(): starting 41016 1727204211.40483: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204211.3238392-43245-128584715063727/ /root/.ansible/tmp/ansible-tmp-1727204211.3238392-43245-128584715063727/AnsiballZ_systemd.py && sleep 0' 41016 1727204211.40862: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204211.40893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204211.40896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204211.40898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204211.40900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204211.40902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204211.40952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204211.40955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204211.41061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204211.42962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204211.42987: stderr chunk (state=3): >>><<< 41016 1727204211.42990: stdout chunk (state=3): >>><<< 41016 1727204211.43003: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204211.43006: _low_level_execute_command(): starting 41016 1727204211.43014: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204211.3238392-43245-128584715063727/AnsiballZ_systemd.py && sleep 0' 41016 1727204211.43425: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204211.43428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204211.43433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204211.43438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204211.43440: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204211.43488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204211.43495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204211.43496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204211.43579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204211.74856: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", 
"ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10752000", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3285377024", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1630687000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", 
"LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", 
"Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 41016 1727204211.77058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204211.77062: stdout chunk (state=3): >>><<< 41016 1727204211.77068: stderr chunk (state=3): >>><<< 41016 1727204211.77120: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10752000", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3285377024", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "1630687000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 41016 1727204211.77516: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204211.3238392-43245-128584715063727/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204211.77536: _low_level_execute_command(): starting 41016 1727204211.77540: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204211.3238392-43245-128584715063727/ > /dev/null 2>&1 && sleep 0' 41016 1727204211.78800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204211.79127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204211.79314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204211.81286: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 <<< 41016 1727204211.81336: stderr chunk (state=3): >>><<< 41016 1727204211.81344: stdout chunk (state=3): >>><<< 41016 1727204211.81366: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204211.81374: handler run complete 41016 1727204211.81454: attempt loop complete, returning result 41016 1727204211.81457: _execute() done 41016 1727204211.81459: dumping result to json 41016 1727204211.81608: done dumping result, returning 41016 1727204211.81672: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-12d5-0ec4-00000000064d] 41016 1727204211.81675: sending task result for task 028d2410-947f-12d5-0ec4-00000000064d ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41016 1727204211.82182: no more pending results, returning what we have 41016 1727204211.82186: results queue empty 41016 1727204211.82188: checking for any_errors_fatal 41016 1727204211.82193: done checking for any_errors_fatal 41016 1727204211.82194: checking for max_fail_percentage 41016 1727204211.82196: done checking for max_fail_percentage 41016 1727204211.82197: checking to see if all hosts have failed and the running result is not ok 41016 1727204211.82197: done checking to see if all hosts have failed 41016 1727204211.82198: getting the remaining hosts for this loop 41016 1727204211.82199: done getting the remaining hosts for this loop 41016 1727204211.82202: getting the next task for host managed-node1 41016 1727204211.82212: done getting next task for host managed-node1 41016 1727204211.82215: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41016 1727204211.82219: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204211.82232: getting variables 41016 1727204211.82234: in VariableManager get_vars() 41016 1727204211.82381: Calling all_inventory to load vars for managed-node1 41016 1727204211.82384: Calling groups_inventory to load vars for managed-node1 41016 1727204211.82387: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204211.82397: Calling all_plugins_play to load vars for managed-node1 41016 1727204211.82399: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204211.82402: Calling groups_plugins_play to load vars for managed-node1 41016 1727204211.83174: done sending task result for task 028d2410-947f-12d5-0ec4-00000000064d 41016 1727204211.83179: WORKER PROCESS EXITING 41016 1727204211.85209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204211.89191: done with get_vars() 41016 1727204211.89222: done getting variables 41016 1727204211.89486: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:56:51 -0400 (0:00:00.758) 0:00:35.571 ***** 41016 1727204211.89523: entering _queue_task() for managed-node1/service 41016 1727204211.90305: worker is 1 (out of 1 available) 41016 1727204211.90317: exiting _queue_task() for managed-node1/service 41016 1727204211.90327: done queuing things up, now waiting for results queue to drain 41016 1727204211.90328: waiting for pending results... 
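The NetworkManager result above comes from the role's "Enable and start NetworkManager" task, which runs the systemd module with no_log enabled (hence the "censored" result in the play output) and the module_args recorded in the _execute_module line: name=NetworkManager, state=started, enabled=true. A minimal standalone sketch that would produce an equivalent invocation; this is only assumed to resemble the real task in roles/network/tasks/main.yml, not a copy of it:

- name: Enable and start NetworkManager   # sketch; mirrors the logged module_args, not the role's actual task
  ansible.builtin.systemd:                # the log resolves this as ansible.legacy.systemd
    name: NetworkManager
    state: started
    enabled: true
  no_log: true                            # why the play output above shows only "censored"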
41016 1727204211.90578: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 41016 1727204211.90893: in run() - task 028d2410-947f-12d5-0ec4-00000000064e 41016 1727204211.90906: variable 'ansible_search_path' from source: unknown 41016 1727204211.90913: variable 'ansible_search_path' from source: unknown 41016 1727204211.90946: calling self._execute() 41016 1727204211.91040: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204211.91044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204211.91054: variable 'omit' from source: magic vars 41016 1727204211.91831: variable 'ansible_distribution_major_version' from source: facts 41016 1727204211.91841: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204211.92154: variable 'network_provider' from source: set_fact 41016 1727204211.92160: Evaluated conditional (network_provider == "nm"): True 41016 1727204211.92250: variable '__network_wpa_supplicant_required' from source: role '' defaults 41016 1727204211.92538: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 41016 1727204211.92890: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204211.97690: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204211.97754: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204211.97789: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204211.97822: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204211.97847: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204211.98128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204211.98161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204211.98387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204211.98427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204211.98442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204211.98492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204211.98516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 41016 1727204211.98557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204211.98582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204211.98596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204211.98636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204211.98658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204211.98884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204211.98924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204211.98939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204211.99286: variable 'network_connections' from source: include params 41016 1727204211.99300: variable 'interface0' from source: play vars 41016 1727204211.99379: variable 'interface0' from source: play vars 41016 1727204211.99391: variable 'interface1' from source: play vars 41016 1727204211.99453: variable 'interface1' from source: play vars 41016 1727204211.99747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 41016 1727204212.00133: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 41016 1727204212.00151: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 41016 1727204212.00183: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 41016 1727204212.00214: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 41016 1727204212.00350: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 41016 1727204212.00353: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 41016 1727204212.00502: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204212.00528: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 41016 1727204212.00579: variable '__network_wireless_connections_defined' from source: role '' defaults 41016 1727204212.01014: variable 'network_connections' from source: include params 41016 1727204212.01017: variable 'interface0' from source: play vars 41016 1727204212.01079: variable 'interface0' from source: play vars 41016 1727204212.01290: variable 'interface1' from source: play vars 41016 1727204212.01348: variable 'interface1' from source: play vars 41016 1727204212.01379: Evaluated conditional (__network_wpa_supplicant_required): False 41016 1727204212.01521: when evaluation is False, skipping this task 41016 1727204212.01596: _execute() done 41016 1727204212.01599: dumping result to json 41016 1727204212.01601: done dumping result, returning 41016 1727204212.01604: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-12d5-0ec4-00000000064e] 41016 1727204212.01615: sending task result for task 028d2410-947f-12d5-0ec4-00000000064e 41016 1727204212.01764: done sending task result for task 028d2410-947f-12d5-0ec4-00000000064e 41016 1727204212.01767: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 41016 1727204212.01845: no more pending results, returning what we have 41016 1727204212.01850: results queue empty 41016 1727204212.01851: checking for any_errors_fatal 41016 1727204212.01874: done checking for any_errors_fatal 41016 1727204212.01875: checking for max_fail_percentage 41016 1727204212.01879: done checking for max_fail_percentage 41016 1727204212.01880: checking to see if all hosts have failed and the running result is not ok 41016 1727204212.01881: done checking to see if all hosts have failed 41016 1727204212.01882: getting the remaining hosts for this loop 41016 1727204212.01883: done getting the remaining hosts for this loop 41016 1727204212.01887: getting the next task for host managed-node1 41016 1727204212.01896: done getting next task for host managed-node1 41016 1727204212.01900: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 41016 1727204212.01905: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204212.01927: getting variables 41016 1727204212.01929: in VariableManager get_vars() 41016 1727204212.01975: Calling all_inventory to load vars for managed-node1 41016 1727204212.02193: Calling groups_inventory to load vars for managed-node1 41016 1727204212.02197: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204212.02208: Calling all_plugins_play to load vars for managed-node1 41016 1727204212.02216: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204212.02220: Calling groups_plugins_play to load vars for managed-node1 41016 1727204212.04621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204212.06496: done with get_vars() 41016 1727204212.06526: done getting variables 41016 1727204212.06595: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:56:52 -0400 (0:00:00.171) 0:00:35.742 ***** 41016 1727204212.06632: entering _queue_task() for managed-node1/service 41016 1727204212.07317: worker is 1 (out of 1 available) 41016 1727204212.07328: exiting _queue_task() for managed-node1/service 41016 1727204212.07340: done queuing things up, now waiting for results queue to drain 41016 1727204212.07342: waiting for pending results... 41016 1727204212.07972: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 41016 1727204212.08106: in run() - task 028d2410-947f-12d5-0ec4-00000000064f 41016 1727204212.08119: variable 'ansible_search_path' from source: unknown 41016 1727204212.08122: variable 'ansible_search_path' from source: unknown 41016 1727204212.08160: calling self._execute() 41016 1727204212.08526: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204212.08530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204212.08532: variable 'omit' from source: magic vars 41016 1727204212.08934: variable 'ansible_distribution_major_version' from source: facts 41016 1727204212.08945: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204212.09060: variable 'network_provider' from source: set_fact 41016 1727204212.09071: Evaluated conditional (network_provider == "initscripts"): False 41016 1727204212.09075: when evaluation is False, skipping this task 41016 1727204212.09080: _execute() done 41016 1727204212.09083: dumping result to json 41016 1727204212.09086: done dumping result, returning 41016 1727204212.09163: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-12d5-0ec4-00000000064f] 41016 1727204212.09166: sending task result for task 028d2410-947f-12d5-0ec4-00000000064f 41016 1727204212.09229: done sending task result for task 028d2410-947f-12d5-0ec4-00000000064f 41016 1727204212.09232: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 41016 
1727204212.09313: no more pending results, returning what we have 41016 1727204212.09317: results queue empty 41016 1727204212.09318: checking for any_errors_fatal 41016 1727204212.09324: done checking for any_errors_fatal 41016 1727204212.09324: checking for max_fail_percentage 41016 1727204212.09326: done checking for max_fail_percentage 41016 1727204212.09327: checking to see if all hosts have failed and the running result is not ok 41016 1727204212.09328: done checking to see if all hosts have failed 41016 1727204212.09328: getting the remaining hosts for this loop 41016 1727204212.09330: done getting the remaining hosts for this loop 41016 1727204212.09333: getting the next task for host managed-node1 41016 1727204212.09339: done getting next task for host managed-node1 41016 1727204212.09342: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41016 1727204212.09346: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204212.09363: getting variables 41016 1727204212.09364: in VariableManager get_vars() 41016 1727204212.09407: Calling all_inventory to load vars for managed-node1 41016 1727204212.09412: Calling groups_inventory to load vars for managed-node1 41016 1727204212.09414: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204212.09423: Calling all_plugins_play to load vars for managed-node1 41016 1727204212.09425: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204212.09428: Calling groups_plugins_play to load vars for managed-node1 41016 1727204212.11305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204212.13599: done with get_vars() 41016 1727204212.13615: done getting variables 41016 1727204212.13659: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:56:52 -0400 (0:00:00.070) 0:00:35.812 ***** 41016 1727204212.13685: entering _queue_task() for managed-node1/copy 41016 1727204212.13932: worker is 1 (out of 1 available) 41016 1727204212.13944: exiting _queue_task() for managed-node1/copy 41016 1727204212.13956: done queuing things up, now waiting for results queue to drain 41016 1727204212.13958: waiting for pending results... 41016 1727204212.14150: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 41016 1727204212.14234: in run() - task 028d2410-947f-12d5-0ec4-000000000650 41016 1727204212.14245: variable 'ansible_search_path' from source: unknown 41016 1727204212.14248: variable 'ansible_search_path' from source: unknown 41016 1727204212.14277: calling self._execute() 41016 1727204212.14353: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204212.14356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204212.14364: variable 'omit' from source: magic vars 41016 1727204212.14650: variable 'ansible_distribution_major_version' from source: facts 41016 1727204212.14659: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204212.14743: variable 'network_provider' from source: set_fact 41016 1727204212.14746: Evaluated conditional (network_provider == "initscripts"): False 41016 1727204212.14750: when evaluation is False, skipping this task 41016 1727204212.14754: _execute() done 41016 1727204212.14756: dumping result to json 41016 1727204212.14761: done dumping result, returning 41016 1727204212.14769: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-12d5-0ec4-000000000650] 41016 1727204212.14772: sending task result for task 028d2410-947f-12d5-0ec4-000000000650 41016 1727204212.14935: done sending task result for task 028d2410-947f-12d5-0ec4-000000000650 41016 1727204212.14938: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 41016 1727204212.15004: no more pending results, returning what we have 41016 1727204212.15008: results queue empty 41016 1727204212.15009: checking for any_errors_fatal 41016 1727204212.15013: done checking for any_errors_fatal 41016 1727204212.15014: checking for max_fail_percentage 41016 1727204212.15015: done checking for max_fail_percentage 41016 1727204212.15016: checking to see if all hosts have failed and the running result is not ok 41016 1727204212.15016: done checking to see if all hosts have failed 41016 1727204212.15017: getting the remaining hosts for this loop 41016 1727204212.15018: done getting the remaining hosts for this loop 41016 1727204212.15021: getting the next task for host managed-node1 41016 1727204212.15028: done getting next task for host managed-node1 41016 1727204212.15032: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41016 1727204212.15035: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204212.15054: getting variables 41016 1727204212.15056: in VariableManager get_vars() 41016 1727204212.15093: Calling all_inventory to load vars for managed-node1 41016 1727204212.15095: Calling groups_inventory to load vars for managed-node1 41016 1727204212.15100: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204212.15111: Calling all_plugins_play to load vars for managed-node1 41016 1727204212.15114: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204212.15120: Calling groups_plugins_play to load vars for managed-node1 41016 1727204212.16839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204212.17714: done with get_vars() 41016 1727204212.17729: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:56:52 -0400 (0:00:00.041) 0:00:35.854 ***** 41016 1727204212.17791: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 41016 1727204212.18023: worker is 1 (out of 1 available) 41016 1727204212.18035: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 41016 1727204212.18047: done queuing things up, now waiting for results queue to drain 41016 1727204212.18048: waiting for pending results... 
41016 1727204212.18233: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 41016 1727204212.18322: in run() - task 028d2410-947f-12d5-0ec4-000000000651 41016 1727204212.18333: variable 'ansible_search_path' from source: unknown 41016 1727204212.18337: variable 'ansible_search_path' from source: unknown 41016 1727204212.18365: calling self._execute() 41016 1727204212.18438: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204212.18455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204212.18459: variable 'omit' from source: magic vars 41016 1727204212.18891: variable 'ansible_distribution_major_version' from source: facts 41016 1727204212.18895: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204212.18900: variable 'omit' from source: magic vars 41016 1727204212.18966: variable 'omit' from source: magic vars 41016 1727204212.19182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 41016 1727204212.20921: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 41016 1727204212.20967: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 41016 1727204212.20998: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 41016 1727204212.21030: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 41016 1727204212.21050: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 41016 1727204212.21113: variable 'network_provider' from source: set_fact 41016 1727204212.21211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 41016 1727204212.21232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 41016 1727204212.21249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 41016 1727204212.21277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 41016 1727204212.21288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 41016 1727204212.21364: variable 'omit' from source: magic vars 41016 1727204212.21464: variable 'omit' from source: magic vars 41016 1727204212.21553: variable 'network_connections' from source: include params 41016 1727204212.21564: variable 'interface0' from source: play vars 41016 1727204212.21632: variable 'interface0' from source: play vars 41016 1727204212.21640: variable 'interface1' from source: play vars 41016 1727204212.21881: variable 'interface1' from source: play vars 41016 1727204212.21884: variable 'omit' from source: magic vars 41016 
1727204212.21887: variable '__lsr_ansible_managed' from source: task vars 41016 1727204212.21892: variable '__lsr_ansible_managed' from source: task vars 41016 1727204212.22456: Loaded config def from plugin (lookup/template) 41016 1727204212.22462: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 41016 1727204212.22497: File lookup term: get_ansible_managed.j2 41016 1727204212.22500: variable 'ansible_search_path' from source: unknown 41016 1727204212.22503: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 41016 1727204212.22521: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 41016 1727204212.22536: variable 'ansible_search_path' from source: unknown 41016 1727204212.26806: variable 'ansible_managed' from source: unknown 41016 1727204212.26890: variable 'omit' from source: magic vars 41016 1727204212.26910: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204212.26933: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204212.26948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204212.26961: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204212.26969: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204212.26995: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204212.26998: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204212.27001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204212.27065: Set connection var ansible_shell_executable to /bin/sh 41016 1727204212.27069: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204212.27074: Set connection var ansible_shell_type to sh 41016 1727204212.27081: Set connection var ansible_timeout to 10 41016 1727204212.27092: Set connection var ansible_pipelining to False 41016 1727204212.27094: Set connection var ansible_connection to ssh 41016 1727204212.27114: variable 'ansible_shell_executable' from source: unknown 41016 1727204212.27116: variable 'ansible_connection' from source: unknown 41016 
1727204212.27119: variable 'ansible_module_compression' from source: unknown 41016 1727204212.27122: variable 'ansible_shell_type' from source: unknown 41016 1727204212.27125: variable 'ansible_shell_executable' from source: unknown 41016 1727204212.27127: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204212.27131: variable 'ansible_pipelining' from source: unknown 41016 1727204212.27133: variable 'ansible_timeout' from source: unknown 41016 1727204212.27138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204212.27233: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204212.27244: variable 'omit' from source: magic vars 41016 1727204212.27247: starting attempt loop 41016 1727204212.27249: running the handler 41016 1727204212.27260: _low_level_execute_command(): starting 41016 1727204212.27268: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204212.27736: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204212.27773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204212.27778: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204212.27781: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204212.27783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 41016 1727204212.27785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204212.27823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204212.27826: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204212.27837: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204212.27934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204212.29708: stdout chunk (state=3): >>>/root <<< 41016 1727204212.29804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204212.29836: stderr chunk (state=3): >>><<< 41016 1727204212.29839: stdout chunk (state=3): >>><<< 41016 1727204212.29857: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204212.29867: _low_level_execute_command(): starting 41016 1727204212.29873: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204212.2985716-43292-236104330062861 `" && echo ansible-tmp-1727204212.2985716-43292-236104330062861="` echo /root/.ansible/tmp/ansible-tmp-1727204212.2985716-43292-236104330062861 `" ) && sleep 0' 41016 1727204212.30311: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204212.30315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204212.30317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204212.30319: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204212.30321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204212.30364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204212.30384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204212.30471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204212.32589: stdout chunk (state=3): >>>ansible-tmp-1727204212.2985716-43292-236104330062861=/root/.ansible/tmp/ansible-tmp-1727204212.2985716-43292-236104330062861 <<< 41016 1727204212.32677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204212.32706: stderr chunk (state=3): >>><<< 41016 1727204212.32712: stdout chunk (state=3): >>><<< 41016 1727204212.32720: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204212.2985716-43292-236104330062861=/root/.ansible/tmp/ansible-tmp-1727204212.2985716-43292-236104330062861 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204212.32762: variable 'ansible_module_compression' from source: unknown 41016 1727204212.32798: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 41016 1727204212.32823: variable 'ansible_facts' from source: unknown 41016 1727204212.32892: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204212.2985716-43292-236104330062861/AnsiballZ_network_connections.py 41016 1727204212.32988: Sending initial data 41016 1727204212.32991: Sent initial data (168 bytes) 41016 1727204212.33642: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204212.33728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204212.35564: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension 
"copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204212.35781: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204212.35785: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmp0b9b04a8 /root/.ansible/tmp/ansible-tmp-1727204212.2985716-43292-236104330062861/AnsiballZ_network_connections.py <<< 41016 1727204212.35788: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204212.2985716-43292-236104330062861/AnsiballZ_network_connections.py" <<< 41016 1727204212.35798: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmp0b9b04a8" to remote "/root/.ansible/tmp/ansible-tmp-1727204212.2985716-43292-236104330062861/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204212.2985716-43292-236104330062861/AnsiballZ_network_connections.py" <<< 41016 1727204212.37914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204212.37919: stdout chunk (state=3): >>><<< 41016 1727204212.37924: stderr chunk (state=3): >>><<< 41016 1727204212.37955: done transferring module to remote 41016 1727204212.37966: _low_level_execute_command(): starting 41016 1727204212.37972: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204212.2985716-43292-236104330062861/ /root/.ansible/tmp/ansible-tmp-1727204212.2985716-43292-236104330062861/AnsiballZ_network_connections.py && sleep 0' 41016 1727204212.38625: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204212.38646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204212.38690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204212.38750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204212.38771: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204212.38790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204212.38889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204212.41083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204212.41086: stdout chunk (state=3): >>><<< 41016 1727204212.41089: stderr chunk (state=3): >>><<< 41016 1727204212.41092: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204212.41094: _low_level_execute_command(): starting 41016 1727204212.41096: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204212.2985716-43292-236104330062861/AnsiballZ_network_connections.py && sleep 0' 41016 1727204212.41618: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204212.41626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204212.41690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204212.41740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204212.41756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204212.41773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204212.41894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204212.84567: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fyimvnna/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fyimvnna/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", 
line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/9a787842-0db5-45cc-82f4-1fb96e28cf45: error=unknown <<< 41016 1727204212.86325: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fyimvnna/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fyimvnna/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest1/af2476db-1e3b-4f5e-ab84-23db91da8d4b: error=unknown <<< 41016 1727204212.86522: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 41016 1727204212.88719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204212.88754: stderr chunk (state=3): >>><<< 41016 1727204212.88757: stdout chunk (state=3): >>><<< 41016 1727204212.88773: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fyimvnna/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fyimvnna/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/9a787842-0db5-45cc-82f4-1fb96e28cf45: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fyimvnna/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fyimvnna/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest1/af2476db-1e3b-4f5e-ab84-23db91da8d4b: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 41016 1727204212.88808: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'ethtest1', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204212.2985716-43292-236104330062861/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204212.88816: _low_level_execute_command(): starting 41016 1727204212.88821: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204212.2985716-43292-236104330062861/ > /dev/null 2>&1 && sleep 0' 41016 1727204212.89266: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204212.89299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204212.89303: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204212.89306: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204212.89360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204212.89364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204212.89373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204212.89450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204212.91445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204212.91451: stderr chunk (state=3): >>><<< 41016 1727204212.91454: stdout chunk (state=3): >>><<< 41016 1727204212.91469: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204212.91475: handler run complete 41016 1727204212.91498: attempt loop complete, returning result 41016 1727204212.91501: _execute() done 41016 1727204212.91504: dumping result to json 41016 1727204212.91509: done dumping result, returning 41016 1727204212.91519: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-12d5-0ec4-000000000651] 41016 1727204212.91521: sending task result for task 028d2410-947f-12d5-0ec4-000000000651 41016 1727204212.91624: done sending task result for task 028d2410-947f-12d5-0ec4-000000000651 41016 1727204212.91628: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent", "state": "down" }, { "name": "ethtest1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 41016 1727204212.91743: no more pending results, returning what we have 41016 1727204212.91747: results queue empty 41016 1727204212.91748: checking for any_errors_fatal 41016 1727204212.91756: done checking for any_errors_fatal 41016 1727204212.91757: checking for max_fail_percentage 41016 1727204212.91759: done checking for max_fail_percentage 41016 1727204212.91760: checking to see if all hosts have failed and the running result is not ok 41016 1727204212.91761: done checking to see if all hosts have failed 41016 1727204212.91761: getting the remaining hosts for this loop 41016 1727204212.91763: done getting the remaining hosts for this loop 41016 1727204212.91766: getting the next task for host managed-node1 41016 1727204212.91773: done getting next task for host managed-node1 41016 1727204212.91778: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 41016 1727204212.91781: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204212.91791: getting variables 41016 1727204212.91793: in VariableManager get_vars() 41016 1727204212.91833: Calling all_inventory to load vars for managed-node1 41016 1727204212.91836: Calling groups_inventory to load vars for managed-node1 41016 1727204212.91838: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204212.91846: Calling all_plugins_play to load vars for managed-node1 41016 1727204212.91849: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204212.91851: Calling groups_plugins_play to load vars for managed-node1 41016 1727204212.92807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204212.93667: done with get_vars() 41016 1727204212.93687: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:56:52 -0400 (0:00:00.759) 0:00:36.613 ***** 41016 1727204212.93751: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 41016 1727204212.94017: worker is 1 (out of 1 available) 41016 1727204212.94030: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 41016 1727204212.94041: done queuing things up, now waiting for results queue to drain 41016 1727204212.94042: waiting for pending results... 
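
The entries above chronicle the complete remote-execution lifecycle Ansible goes through for the fedora.linux_system_roles.network_connections module: discover the remote home directory (echo ~), create a private temporary directory under ~/.ansible/tmp, push the AnsiballZ_network_connections.py wrapper over sftp, chmod it, run it with the remote /usr/bin/python3.12 (its JSON result comes back on stdout, after the two LsrNetworkNmError tracebacks), and finally remove the temporary directory. The sketch below reproduces that sequence with plain ssh/scp calls purely as an illustration; it is not Ansible's connection-plugin code, and HOST, the demo-tmp directory name, and the use of scp instead of sftp are assumptions made for brevity.

    # Illustrative approximation of the remote command sequence seen in the log.
    # Not Ansible's implementation; host, temp-dir name and scp usage are placeholders.
    import subprocess

    HOST = "root@10.31.14.47"                      # target address from the log
    MODULE = "AnsiballZ_network_connections.py"    # AnsiballZ-wrapped module payload

    def ssh(cmd: str) -> str:
        """Run a remote command (Ansible wraps these in /bin/sh -c '...' itself)."""
        return subprocess.run(["ssh", HOST, cmd],
                              capture_output=True, text=True, check=True).stdout

    home = ssh("echo ~ && sleep 0").strip()                       # 1. discover remote home (/root)
    tmpdir = f"{home}/.ansible/tmp/demo-tmp"                      # log uses ansible-tmp-<ts>-<pid>-<rand>
    ssh(f'umask 77 && mkdir -p "{home}/.ansible/tmp" && mkdir "{tmpdir}"')    # 2. private temp dir
    subprocess.run(["scp", MODULE, f"{HOST}:{tmpdir}/{MODULE}"], check=True)  # 3. transfer (log uses sftp put)
    ssh(f"chmod u+x {tmpdir}/ {tmpdir}/{MODULE}")                 # 4. make dir and wrapper executable
    result_json = ssh(f"/usr/bin/python3.12 {tmpdir}/{MODULE}")   # 5. execute; module prints JSON on stdout
    ssh(f"rm -f -r {tmpdir}/ > /dev/null 2>&1")                   # 6. clean up the temp dir
    print(result_json)
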
41016 1727204212.94236: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 41016 1727204212.94336: in run() - task 028d2410-947f-12d5-0ec4-000000000652 41016 1727204212.94347: variable 'ansible_search_path' from source: unknown 41016 1727204212.94351: variable 'ansible_search_path' from source: unknown 41016 1727204212.94384: calling self._execute() 41016 1727204212.94453: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204212.94457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204212.94465: variable 'omit' from source: magic vars 41016 1727204212.94757: variable 'ansible_distribution_major_version' from source: facts 41016 1727204212.94765: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204212.94853: variable 'network_state' from source: role '' defaults 41016 1727204212.94861: Evaluated conditional (network_state != {}): False 41016 1727204212.94864: when evaluation is False, skipping this task 41016 1727204212.94867: _execute() done 41016 1727204212.94869: dumping result to json 41016 1727204212.94874: done dumping result, returning 41016 1727204212.94881: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-12d5-0ec4-000000000652] 41016 1727204212.94887: sending task result for task 028d2410-947f-12d5-0ec4-000000000652 41016 1727204212.94969: done sending task result for task 028d2410-947f-12d5-0ec4-000000000652 41016 1727204212.94972: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 41016 1727204212.95024: no more pending results, returning what we have 41016 1727204212.95029: results queue empty 41016 1727204212.95030: checking for any_errors_fatal 41016 1727204212.95043: done checking for any_errors_fatal 41016 1727204212.95044: checking for max_fail_percentage 41016 1727204212.95045: done checking for max_fail_percentage 41016 1727204212.95046: checking to see if all hosts have failed and the running result is not ok 41016 1727204212.95047: done checking to see if all hosts have failed 41016 1727204212.95048: getting the remaining hosts for this loop 41016 1727204212.95049: done getting the remaining hosts for this loop 41016 1727204212.95053: getting the next task for host managed-node1 41016 1727204212.95060: done getting next task for host managed-node1 41016 1727204212.95063: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41016 1727204212.95068: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 41016 1727204212.95095: getting variables 41016 1727204212.95096: in VariableManager get_vars() 41016 1727204212.95137: Calling all_inventory to load vars for managed-node1 41016 1727204212.95139: Calling groups_inventory to load vars for managed-node1 41016 1727204212.95141: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204212.95150: Calling all_plugins_play to load vars for managed-node1 41016 1727204212.95152: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204212.95155: Calling groups_plugins_play to load vars for managed-node1 41016 1727204212.95946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204212.96818: done with get_vars() 41016 1727204212.96837: done getting variables 41016 1727204212.96881: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:56:52 -0400 (0:00:00.031) 0:00:36.645 ***** 41016 1727204212.96906: entering _queue_task() for managed-node1/debug 41016 1727204212.97163: worker is 1 (out of 1 available) 41016 1727204212.97178: exiting _queue_task() for managed-node1/debug 41016 1727204212.97190: done queuing things up, now waiting for results queue to drain 41016 1727204212.97191: waiting for pending results... 
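
The skip recorded just above ("Configure networking state", skip_reason: Conditional result was False) comes from the task's when: condition: Ansible renders the expression against the current variables and skips the task when the result is false, which is exactly what "Evaluated conditional (network_state != {}): False" documents, since network_state here is the role's empty default. The snippet below is a minimal approximation of that idea with plain Jinja2; it is not Ansible's Templar/conditional implementation, evaluate_when is a made-up helper, and the distribution version in the second example is only illustrative.

    # Minimal sketch: a `when:` expression is rendered as a Jinja2 expression
    # against the task variables; a false result means "skip this task".
    # Not Ansible's actual code; evaluate_when is a hypothetical helper.
    import jinja2

    _env = jinja2.Environment()

    def evaluate_when(expression: str, variables: dict) -> bool:
        rendered = _env.from_string("{{ " + expression + " }}").render(**variables)
        return rendered == "True"

    # network_state is the role's empty default, so the conditional is False
    # and the task is skipped, as the log records.
    print(evaluate_when("network_state != {}", {"network_state": {}}))        # False -> skip
    # The distribution check that passed earlier in the log (value illustrative):
    print(evaluate_when("ansible_distribution_major_version != '6'",
                        {"ansible_distribution_major_version": "40"}))        # True -> run
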
41016 1727204212.97384: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 41016 1727204212.97460: in run() - task 028d2410-947f-12d5-0ec4-000000000653 41016 1727204212.97471: variable 'ansible_search_path' from source: unknown 41016 1727204212.97477: variable 'ansible_search_path' from source: unknown 41016 1727204212.97506: calling self._execute() 41016 1727204212.97580: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204212.97586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204212.97594: variable 'omit' from source: magic vars 41016 1727204212.97872: variable 'ansible_distribution_major_version' from source: facts 41016 1727204212.97883: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204212.97888: variable 'omit' from source: magic vars 41016 1727204212.97928: variable 'omit' from source: magic vars 41016 1727204212.97950: variable 'omit' from source: magic vars 41016 1727204212.97987: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204212.98015: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204212.98029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204212.98042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204212.98052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204212.98077: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204212.98081: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204212.98083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204212.98151: Set connection var ansible_shell_executable to /bin/sh 41016 1727204212.98154: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204212.98160: Set connection var ansible_shell_type to sh 41016 1727204212.98164: Set connection var ansible_timeout to 10 41016 1727204212.98170: Set connection var ansible_pipelining to False 41016 1727204212.98186: Set connection var ansible_connection to ssh 41016 1727204212.98197: variable 'ansible_shell_executable' from source: unknown 41016 1727204212.98199: variable 'ansible_connection' from source: unknown 41016 1727204212.98202: variable 'ansible_module_compression' from source: unknown 41016 1727204212.98204: variable 'ansible_shell_type' from source: unknown 41016 1727204212.98206: variable 'ansible_shell_executable' from source: unknown 41016 1727204212.98208: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204212.98213: variable 'ansible_pipelining' from source: unknown 41016 1727204212.98215: variable 'ansible_timeout' from source: unknown 41016 1727204212.98217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204212.98323: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 
1727204212.98332: variable 'omit' from source: magic vars 41016 1727204212.98338: starting attempt loop 41016 1727204212.98340: running the handler 41016 1727204212.98434: variable '__network_connections_result' from source: set_fact 41016 1727204212.98472: handler run complete 41016 1727204212.98487: attempt loop complete, returning result 41016 1727204212.98490: _execute() done 41016 1727204212.98493: dumping result to json 41016 1727204212.98495: done dumping result, returning 41016 1727204212.98504: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-12d5-0ec4-000000000653] 41016 1727204212.98507: sending task result for task 028d2410-947f-12d5-0ec4-000000000653 41016 1727204212.98589: done sending task result for task 028d2410-947f-12d5-0ec4-000000000653 41016 1727204212.98592: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "" ] } 41016 1727204212.98682: no more pending results, returning what we have 41016 1727204212.98685: results queue empty 41016 1727204212.98686: checking for any_errors_fatal 41016 1727204212.98692: done checking for any_errors_fatal 41016 1727204212.98693: checking for max_fail_percentage 41016 1727204212.98694: done checking for max_fail_percentage 41016 1727204212.98695: checking to see if all hosts have failed and the running result is not ok 41016 1727204212.98696: done checking to see if all hosts have failed 41016 1727204212.98696: getting the remaining hosts for this loop 41016 1727204212.98698: done getting the remaining hosts for this loop 41016 1727204212.98701: getting the next task for host managed-node1 41016 1727204212.98708: done getting next task for host managed-node1 41016 1727204212.98714: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41016 1727204212.98718: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204212.98728: getting variables 41016 1727204212.98730: in VariableManager get_vars() 41016 1727204212.98764: Calling all_inventory to load vars for managed-node1 41016 1727204212.98766: Calling groups_inventory to load vars for managed-node1 41016 1727204212.98768: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204212.98783: Calling all_plugins_play to load vars for managed-node1 41016 1727204212.98786: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204212.98789: Calling groups_plugins_play to load vars for managed-node1 41016 1727204212.99665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204213.00529: done with get_vars() 41016 1727204213.00544: done getting variables 41016 1727204213.00587: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:56:53 -0400 (0:00:00.037) 0:00:36.682 ***** 41016 1727204213.00608: entering _queue_task() for managed-node1/debug 41016 1727204213.00844: worker is 1 (out of 1 available) 41016 1727204213.00857: exiting _queue_task() for managed-node1/debug 41016 1727204213.00870: done queuing things up, now waiting for results queue to drain 41016 1727204213.00871: waiting for pending results... 41016 1727204213.01050: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 41016 1727204213.01141: in run() - task 028d2410-947f-12d5-0ec4-000000000654 41016 1727204213.01152: variable 'ansible_search_path' from source: unknown 41016 1727204213.01156: variable 'ansible_search_path' from source: unknown 41016 1727204213.01184: calling self._execute() 41016 1727204213.01258: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204213.01261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204213.01269: variable 'omit' from source: magic vars 41016 1727204213.01546: variable 'ansible_distribution_major_version' from source: facts 41016 1727204213.01555: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204213.01561: variable 'omit' from source: magic vars 41016 1727204213.01599: variable 'omit' from source: magic vars 41016 1727204213.01623: variable 'omit' from source: magic vars 41016 1727204213.01656: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204213.01683: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204213.01698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204213.01713: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204213.01721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204213.01743: variable 
'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204213.01746: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204213.01750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204213.01820: Set connection var ansible_shell_executable to /bin/sh 41016 1727204213.01824: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204213.01830: Set connection var ansible_shell_type to sh 41016 1727204213.01835: Set connection var ansible_timeout to 10 41016 1727204213.01840: Set connection var ansible_pipelining to False 41016 1727204213.01846: Set connection var ansible_connection to ssh 41016 1727204213.01866: variable 'ansible_shell_executable' from source: unknown 41016 1727204213.01869: variable 'ansible_connection' from source: unknown 41016 1727204213.01872: variable 'ansible_module_compression' from source: unknown 41016 1727204213.01874: variable 'ansible_shell_type' from source: unknown 41016 1727204213.01878: variable 'ansible_shell_executable' from source: unknown 41016 1727204213.01881: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204213.01883: variable 'ansible_pipelining' from source: unknown 41016 1727204213.01885: variable 'ansible_timeout' from source: unknown 41016 1727204213.01887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204213.01986: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204213.01996: variable 'omit' from source: magic vars 41016 1727204213.02002: starting attempt loop 41016 1727204213.02005: running the handler 41016 1727204213.02045: variable '__network_connections_result' from source: set_fact 41016 1727204213.02102: variable '__network_connections_result' from source: set_fact 41016 1727204213.02181: handler run complete 41016 1727204213.02201: attempt loop complete, returning result 41016 1727204213.02204: _execute() done 41016 1727204213.02207: dumping result to json 41016 1727204213.02215: done dumping result, returning 41016 1727204213.02218: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-12d5-0ec4-000000000654] 41016 1727204213.02223: sending task result for task 028d2410-947f-12d5-0ec4-000000000654 41016 1727204213.02306: done sending task result for task 028d2410-947f-12d5-0ec4-000000000654 41016 1727204213.02309: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent", "state": "down" }, { "name": "ethtest1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 41016 1727204213.02401: no more pending results, returning what we have 41016 1727204213.02404: results queue empty 41016 1727204213.02405: checking for any_errors_fatal 41016 1727204213.02412: done checking for any_errors_fatal 41016 1727204213.02413: checking for max_fail_percentage 41016 
1727204213.02414: done checking for max_fail_percentage 41016 1727204213.02415: checking to see if all hosts have failed and the running result is not ok 41016 1727204213.02416: done checking to see if all hosts have failed 41016 1727204213.02416: getting the remaining hosts for this loop 41016 1727204213.02418: done getting the remaining hosts for this loop 41016 1727204213.02422: getting the next task for host managed-node1 41016 1727204213.02429: done getting next task for host managed-node1 41016 1727204213.02432: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41016 1727204213.02436: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204213.02446: getting variables 41016 1727204213.02447: in VariableManager get_vars() 41016 1727204213.02487: Calling all_inventory to load vars for managed-node1 41016 1727204213.02490: Calling groups_inventory to load vars for managed-node1 41016 1727204213.02492: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204213.02500: Calling all_plugins_play to load vars for managed-node1 41016 1727204213.02502: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204213.02505: Calling groups_plugins_play to load vars for managed-node1 41016 1727204213.03264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204213.04229: done with get_vars() 41016 1727204213.04244: done getting variables 41016 1727204213.04286: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:56:53 -0400 (0:00:00.036) 0:00:36.719 ***** 41016 1727204213.04307: entering _queue_task() for managed-node1/debug 41016 1727204213.04531: worker is 1 (out of 1 available) 41016 1727204213.04559: exiting _queue_task() for managed-node1/debug 41016 1727204213.04573: done queuing things up, now waiting for results queue to drain 41016 1727204213.04574: waiting for pending results... 
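
The task being queued here, "Re-test connectivity", runs the ping module against managed-node1: it pushes a tiny module to the host and simply expects a result back, which confirms that SSH, the remote Python interpreter, and the temp-dir/transfer machinery still work after the connection profiles were removed. The sketch below shows what a minimal ping-style module looks like using the standard AnsibleModule pattern; it is a rough approximation, not the verbatim source of ansible.builtin.ping.

    # Rough sketch of a minimal "ping"-style module (generic AnsibleModule
    # pattern, not the verbatim ansible.builtin.ping source).
    from ansible.module_utils.basic import AnsibleModule

    def main():
        module = AnsibleModule(argument_spec=dict(data=dict(type="str", default="pong")))
        # Reaching this point already proves the control node could connect,
        # transfer the AnsiballZ payload and start the remote interpreter,
        # which is all a connectivity re-test needs.
        module.exit_json(changed=False, ping=module.params["data"])

    if __name__ == "__main__":
        main()
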
41016 1727204213.04747: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 41016 1727204213.04834: in run() - task 028d2410-947f-12d5-0ec4-000000000655 41016 1727204213.04845: variable 'ansible_search_path' from source: unknown 41016 1727204213.04848: variable 'ansible_search_path' from source: unknown 41016 1727204213.04874: calling self._execute() 41016 1727204213.04949: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204213.04954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204213.04961: variable 'omit' from source: magic vars 41016 1727204213.05227: variable 'ansible_distribution_major_version' from source: facts 41016 1727204213.05238: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204213.05317: variable 'network_state' from source: role '' defaults 41016 1727204213.05326: Evaluated conditional (network_state != {}): False 41016 1727204213.05329: when evaluation is False, skipping this task 41016 1727204213.05332: _execute() done 41016 1727204213.05335: dumping result to json 41016 1727204213.05337: done dumping result, returning 41016 1727204213.05346: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-12d5-0ec4-000000000655] 41016 1727204213.05351: sending task result for task 028d2410-947f-12d5-0ec4-000000000655 41016 1727204213.05430: done sending task result for task 028d2410-947f-12d5-0ec4-000000000655 41016 1727204213.05433: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "network_state != {}" } 41016 1727204213.05493: no more pending results, returning what we have 41016 1727204213.05497: results queue empty 41016 1727204213.05498: checking for any_errors_fatal 41016 1727204213.05505: done checking for any_errors_fatal 41016 1727204213.05505: checking for max_fail_percentage 41016 1727204213.05507: done checking for max_fail_percentage 41016 1727204213.05508: checking to see if all hosts have failed and the running result is not ok 41016 1727204213.05509: done checking to see if all hosts have failed 41016 1727204213.05512: getting the remaining hosts for this loop 41016 1727204213.05513: done getting the remaining hosts for this loop 41016 1727204213.05517: getting the next task for host managed-node1 41016 1727204213.05523: done getting next task for host managed-node1 41016 1727204213.05526: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 41016 1727204213.05529: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204213.05544: getting variables 41016 1727204213.05546: in VariableManager get_vars() 41016 1727204213.05581: Calling all_inventory to load vars for managed-node1 41016 1727204213.05584: Calling groups_inventory to load vars for managed-node1 41016 1727204213.05586: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204213.05594: Calling all_plugins_play to load vars for managed-node1 41016 1727204213.05596: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204213.05599: Calling groups_plugins_play to load vars for managed-node1 41016 1727204213.06336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204213.07198: done with get_vars() 41016 1727204213.07215: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:56:53 -0400 (0:00:00.029) 0:00:36.748 ***** 41016 1727204213.07283: entering _queue_task() for managed-node1/ping 41016 1727204213.07498: worker is 1 (out of 1 available) 41016 1727204213.07515: exiting _queue_task() for managed-node1/ping 41016 1727204213.07527: done queuing things up, now waiting for results queue to drain 41016 1727204213.07529: waiting for pending results... 41016 1727204213.07704: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 41016 1727204213.07779: in run() - task 028d2410-947f-12d5-0ec4-000000000656 41016 1727204213.07789: variable 'ansible_search_path' from source: unknown 41016 1727204213.07792: variable 'ansible_search_path' from source: unknown 41016 1727204213.07821: calling self._execute() 41016 1727204213.07895: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204213.07898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204213.07908: variable 'omit' from source: magic vars 41016 1727204213.08174: variable 'ansible_distribution_major_version' from source: facts 41016 1727204213.08186: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204213.08190: variable 'omit' from source: magic vars 41016 1727204213.08227: variable 'omit' from source: magic vars 41016 1727204213.08249: variable 'omit' from source: magic vars 41016 1727204213.08280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204213.08312: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204213.08324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204213.08337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204213.08346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204213.08368: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204213.08371: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204213.08374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204213.08444: Set connection var ansible_shell_executable to /bin/sh 41016 1727204213.08447: Set connection var 
ansible_module_compression to ZIP_DEFLATED 41016 1727204213.08453: Set connection var ansible_shell_type to sh 41016 1727204213.08458: Set connection var ansible_timeout to 10 41016 1727204213.08464: Set connection var ansible_pipelining to False 41016 1727204213.08470: Set connection var ansible_connection to ssh 41016 1727204213.08488: variable 'ansible_shell_executable' from source: unknown 41016 1727204213.08491: variable 'ansible_connection' from source: unknown 41016 1727204213.08494: variable 'ansible_module_compression' from source: unknown 41016 1727204213.08496: variable 'ansible_shell_type' from source: unknown 41016 1727204213.08498: variable 'ansible_shell_executable' from source: unknown 41016 1727204213.08500: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204213.08504: variable 'ansible_pipelining' from source: unknown 41016 1727204213.08506: variable 'ansible_timeout' from source: unknown 41016 1727204213.08513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204213.08651: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204213.08660: variable 'omit' from source: magic vars 41016 1727204213.08666: starting attempt loop 41016 1727204213.08668: running the handler 41016 1727204213.08683: _low_level_execute_command(): starting 41016 1727204213.08690: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204213.09179: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204213.09212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204213.09217: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204213.09219: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204213.09270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204213.09273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204213.09277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204213.09363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204213.11165: stdout chunk (state=3): >>>/root <<< 41016 1727204213.11266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204213.11295: stderr chunk (state=3): >>><<< 41016 1727204213.11298: stdout chunk (state=3): >>><<< 41016 1727204213.11320: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204213.11333: _low_level_execute_command(): starting 41016 1727204213.11339: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204213.1131828-43332-157636065653130 `" && echo ansible-tmp-1727204213.1131828-43332-157636065653130="` echo /root/.ansible/tmp/ansible-tmp-1727204213.1131828-43332-157636065653130 `" ) && sleep 0' 41016 1727204213.11741: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204213.11778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204213.11782: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204213.11784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204213.11794: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204213.11797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204213.11838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204213.11842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204213.11846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204213.11924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204213.13998: stdout chunk (state=3): >>>ansible-tmp-1727204213.1131828-43332-157636065653130=/root/.ansible/tmp/ansible-tmp-1727204213.1131828-43332-157636065653130 <<< 41016 1727204213.14104: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 41016 1727204213.14134: stderr chunk (state=3): >>><<< 41016 1727204213.14137: stdout chunk (state=3): >>><<< 41016 1727204213.14151: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204213.1131828-43332-157636065653130=/root/.ansible/tmp/ansible-tmp-1727204213.1131828-43332-157636065653130 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204213.14228: variable 'ansible_module_compression' from source: unknown 41016 1727204213.14406: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 41016 1727204213.14412: variable 'ansible_facts' from source: unknown 41016 1727204213.14415: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204213.1131828-43332-157636065653130/AnsiballZ_ping.py 41016 1727204213.14556: Sending initial data 41016 1727204213.14559: Sent initial data (153 bytes) 41016 1727204213.15031: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204213.15034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204213.15043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204213.15061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204213.15144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204213.15161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204213.15263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 41016 1727204213.17347: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 41016 1727204213.17382: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204213.17678: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204213.17694: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmp55v04ndk /root/.ansible/tmp/ansible-tmp-1727204213.1131828-43332-157636065653130/AnsiballZ_ping.py <<< 41016 1727204213.17697: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204213.1131828-43332-157636065653130/AnsiballZ_ping.py" <<< 41016 1727204213.17755: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmp55v04ndk" to remote "/root/.ansible/tmp/ansible-tmp-1727204213.1131828-43332-157636065653130/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204213.1131828-43332-157636065653130/AnsiballZ_ping.py" <<< 41016 1727204213.18561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204213.18637: stderr chunk (state=3): >>><<< 41016 1727204213.18646: stdout chunk (state=3): >>><<< 41016 1727204213.18715: done transferring module to remote 41016 1727204213.18803: _low_level_execute_command(): starting 41016 1727204213.18808: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204213.1131828-43332-157636065653130/ /root/.ansible/tmp/ansible-tmp-1727204213.1131828-43332-157636065653130/AnsiballZ_ping.py && sleep 0' 41016 1727204213.19377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204213.19392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204213.19485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204213.19522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' <<< 41016 1727204213.19546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204213.19559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204213.19669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204213.21683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204213.21698: stdout chunk (state=3): >>><<< 41016 1727204213.21709: stderr chunk (state=3): >>><<< 41016 1727204213.21732: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204213.21815: _low_level_execute_command(): starting 41016 1727204213.21819: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204213.1131828-43332-157636065653130/AnsiballZ_ping.py && sleep 0' 41016 1727204213.22377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204213.22392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204213.22445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204213.22517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204213.22534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204213.22564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204213.22686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 41016 1727204213.39274: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 41016 1727204213.40862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204213.40866: stderr chunk (state=3): >>><<< 41016 1727204213.40883: stdout chunk (state=3): >>><<< 41016 1727204213.41020: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 41016 1727204213.41026: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204213.1131828-43332-157636065653130/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204213.41030: _low_level_execute_command(): starting 41016 1727204213.41033: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204213.1131828-43332-157636065653130/ > /dev/null 2>&1 && sleep 0' 41016 1727204213.42089: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204213.42154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204213.42295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204213.42429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204213.42499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204213.42579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204213.42747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204213.44847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204213.44851: stdout chunk (state=3): >>><<< 41016 1727204213.44853: stderr chunk (state=3): >>><<< 41016 1727204213.44872: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204213.44885: handler run complete 41016 1727204213.45281: attempt loop complete, returning result 41016 1727204213.45284: _execute() done 41016 1727204213.45289: dumping result to json 41016 1727204213.45292: done dumping result, returning 41016 1727204213.45295: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-12d5-0ec4-000000000656] 41016 1727204213.45298: sending task result for task 028d2410-947f-12d5-0ec4-000000000656 41016 1727204213.45370: done sending task result for task 028d2410-947f-12d5-0ec4-000000000656 41016 1727204213.45374: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 41016 1727204213.45445: no more pending results, returning what we have 41016 1727204213.45449: results queue empty 41016 1727204213.45451: checking for any_errors_fatal 41016 1727204213.45456: done checking for any_errors_fatal 41016 1727204213.45457: checking for max_fail_percentage 41016 1727204213.45459: done checking for max_fail_percentage 41016 1727204213.45460: checking to see if all hosts have failed and the running result is not ok 41016 1727204213.45461: done checking to see if all hosts have failed 41016 1727204213.45462: getting the remaining hosts for this loop 41016 1727204213.45463: done getting the remaining hosts for this loop 41016 1727204213.45467: getting the next task for host 
managed-node1 41016 1727204213.45488: done getting next task for host managed-node1 41016 1727204213.45491: ^ task is: TASK: meta (role_complete) 41016 1727204213.45495: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204213.45507: getting variables 41016 1727204213.45509: in VariableManager get_vars() 41016 1727204213.45554: Calling all_inventory to load vars for managed-node1 41016 1727204213.45556: Calling groups_inventory to load vars for managed-node1 41016 1727204213.45559: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204213.45568: Calling all_plugins_play to load vars for managed-node1 41016 1727204213.45571: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204213.45573: Calling groups_plugins_play to load vars for managed-node1 41016 1727204213.48723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204213.51124: done with get_vars() 41016 1727204213.51150: done getting variables 41016 1727204213.51236: done queuing things up, now waiting for results queue to drain 41016 1727204213.51238: results queue empty 41016 1727204213.51239: checking for any_errors_fatal 41016 1727204213.51242: done checking for any_errors_fatal 41016 1727204213.51242: checking for max_fail_percentage 41016 1727204213.51243: done checking for max_fail_percentage 41016 1727204213.51244: checking to see if all hosts have failed and the running result is not ok 41016 1727204213.51245: done checking to see if all hosts have failed 41016 1727204213.51246: getting the remaining hosts for this loop 41016 1727204213.51247: done getting the remaining hosts for this loop 41016 1727204213.51249: getting the next task for host managed-node1 41016 1727204213.51254: done getting next task for host managed-node1 41016 1727204213.51257: ^ task is: TASK: Delete interface1 41016 1727204213.51259: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204213.51262: getting variables 41016 1727204213.51263: in VariableManager get_vars() 41016 1727204213.51279: Calling all_inventory to load vars for managed-node1 41016 1727204213.51281: Calling groups_inventory to load vars for managed-node1 41016 1727204213.51283: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204213.51288: Calling all_plugins_play to load vars for managed-node1 41016 1727204213.51290: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204213.51293: Calling groups_plugins_play to load vars for managed-node1 41016 1727204213.52452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204213.54025: done with get_vars() 41016 1727204213.54046: done getting variables TASK [Delete interface1] ******************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:151 Tuesday 24 September 2024 14:56:53 -0400 (0:00:00.468) 0:00:37.217 ***** 41016 1727204213.54120: entering _queue_task() for managed-node1/include_tasks 41016 1727204213.54467: worker is 1 (out of 1 available) 41016 1727204213.54480: exiting _queue_task() for managed-node1/include_tasks 41016 1727204213.54492: done queuing things up, now waiting for results queue to drain 41016 1727204213.54493: waiting for pending results... 41016 1727204213.55093: running TaskExecutor() for managed-node1/TASK: Delete interface1 41016 1727204213.55098: in run() - task 028d2410-947f-12d5-0ec4-0000000000b5 41016 1727204213.55102: variable 'ansible_search_path' from source: unknown 41016 1727204213.55181: calling self._execute() 41016 1727204213.55185: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204213.55188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204213.55191: variable 'omit' from source: magic vars 41016 1727204213.55433: variable 'ansible_distribution_major_version' from source: facts 41016 1727204213.55448: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204213.55680: _execute() done 41016 1727204213.55683: dumping result to json 41016 1727204213.55685: done dumping result, returning 41016 1727204213.55687: done running TaskExecutor() for managed-node1/TASK: Delete interface1 [028d2410-947f-12d5-0ec4-0000000000b5] 41016 1727204213.55689: sending task result for task 028d2410-947f-12d5-0ec4-0000000000b5 41016 1727204213.55754: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000b5 41016 1727204213.55757: WORKER PROCESS EXITING 41016 1727204213.55785: no more pending results, returning what we have 41016 1727204213.55789: in VariableManager get_vars() 41016 1727204213.55829: Calling all_inventory to load vars for managed-node1 41016 1727204213.55831: Calling groups_inventory to load vars for managed-node1 41016 1727204213.55833: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204213.55842: Calling all_plugins_play to load vars for managed-node1 41016 1727204213.55844: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204213.55846: Calling groups_plugins_play to load vars for managed-node1 41016 1727204213.58533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204213.60082: done with get_vars() 41016 1727204213.60102: variable 'ansible_search_path' from 
source: unknown 41016 1727204213.60119: we have included files to process 41016 1727204213.60120: generating all_blocks data 41016 1727204213.60122: done generating all_blocks data 41016 1727204213.60126: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41016 1727204213.60127: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41016 1727204213.60129: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41016 1727204213.60362: done processing included file 41016 1727204213.60364: iterating over new_blocks loaded from include file 41016 1727204213.60365: in VariableManager get_vars() 41016 1727204213.60389: done with get_vars() 41016 1727204213.60391: filtering new block on tags 41016 1727204213.60419: done filtering new block on tags 41016 1727204213.60422: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node1 41016 1727204213.60427: extending task lists for all hosts with included blocks 41016 1727204213.61686: done extending task lists 41016 1727204213.61688: done processing included files 41016 1727204213.61689: results queue empty 41016 1727204213.61690: checking for any_errors_fatal 41016 1727204213.61691: done checking for any_errors_fatal 41016 1727204213.61692: checking for max_fail_percentage 41016 1727204213.61693: done checking for max_fail_percentage 41016 1727204213.61694: checking to see if all hosts have failed and the running result is not ok 41016 1727204213.61694: done checking to see if all hosts have failed 41016 1727204213.61695: getting the remaining hosts for this loop 41016 1727204213.61696: done getting the remaining hosts for this loop 41016 1727204213.61699: getting the next task for host managed-node1 41016 1727204213.61703: done getting next task for host managed-node1 41016 1727204213.61705: ^ task is: TASK: Remove test interface if necessary 41016 1727204213.61708: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204213.61713: getting variables 41016 1727204213.61714: in VariableManager get_vars() 41016 1727204213.61728: Calling all_inventory to load vars for managed-node1 41016 1727204213.61730: Calling groups_inventory to load vars for managed-node1 41016 1727204213.61732: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204213.61737: Calling all_plugins_play to load vars for managed-node1 41016 1727204213.61740: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204213.61742: Calling groups_plugins_play to load vars for managed-node1 41016 1727204213.62934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204213.64569: done with get_vars() 41016 1727204213.64591: done getting variables 41016 1727204213.64634: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 14:56:53 -0400 (0:00:00.105) 0:00:37.322 ***** 41016 1727204213.64662: entering _queue_task() for managed-node1/command 41016 1727204213.65414: worker is 1 (out of 1 available) 41016 1727204213.65423: exiting _queue_task() for managed-node1/command 41016 1727204213.65433: done queuing things up, now waiting for results queue to drain 41016 1727204213.65434: waiting for pending results... 41016 1727204213.65733: running TaskExecutor() for managed-node1/TASK: Remove test interface if necessary 41016 1727204213.65947: in run() - task 028d2410-947f-12d5-0ec4-000000000777 41016 1727204213.66089: variable 'ansible_search_path' from source: unknown 41016 1727204213.66093: variable 'ansible_search_path' from source: unknown 41016 1727204213.66129: calling self._execute() 41016 1727204213.66337: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204213.66343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204213.66352: variable 'omit' from source: magic vars 41016 1727204213.67139: variable 'ansible_distribution_major_version' from source: facts 41016 1727204213.67152: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204213.67167: variable 'omit' from source: magic vars 41016 1727204213.67274: variable 'omit' from source: magic vars 41016 1727204213.67881: variable 'interface' from source: set_fact 41016 1727204213.67884: variable 'omit' from source: magic vars 41016 1727204213.67886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204213.67888: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204213.67890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204213.67945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204213.67955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 
1727204213.68031: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204213.68380: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204213.68383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204213.68386: Set connection var ansible_shell_executable to /bin/sh 41016 1727204213.68388: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204213.68390: Set connection var ansible_shell_type to sh 41016 1727204213.68392: Set connection var ansible_timeout to 10 41016 1727204213.68394: Set connection var ansible_pipelining to False 41016 1727204213.68396: Set connection var ansible_connection to ssh 41016 1727204213.68398: variable 'ansible_shell_executable' from source: unknown 41016 1727204213.68401: variable 'ansible_connection' from source: unknown 41016 1727204213.68403: variable 'ansible_module_compression' from source: unknown 41016 1727204213.68405: variable 'ansible_shell_type' from source: unknown 41016 1727204213.68407: variable 'ansible_shell_executable' from source: unknown 41016 1727204213.68409: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204213.68411: variable 'ansible_pipelining' from source: unknown 41016 1727204213.68413: variable 'ansible_timeout' from source: unknown 41016 1727204213.68415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204213.68419: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204213.68421: variable 'omit' from source: magic vars 41016 1727204213.68423: starting attempt loop 41016 1727204213.68426: running the handler 41016 1727204213.68428: _low_level_execute_command(): starting 41016 1727204213.68430: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204213.69281: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204213.69287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204213.69291: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204213.69294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204213.69296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204213.69298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204213.69368: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 41016 1727204213.71195: stdout chunk (state=3): >>>/root <<< 41016 1727204213.71408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204213.71417: stdout chunk (state=3): >>><<< 41016 1727204213.71426: stderr chunk (state=3): >>><<< 41016 1727204213.71450: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204213.71464: _low_level_execute_command(): starting 41016 1727204213.71471: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204213.7145035-43411-123180081496258 `" && echo ansible-tmp-1727204213.7145035-43411-123180081496258="` echo /root/.ansible/tmp/ansible-tmp-1727204213.7145035-43411-123180081496258 `" ) && sleep 0' 41016 1727204213.72712: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204213.72721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204213.72902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204213.73092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204213.73204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204213.75327: stdout chunk (state=3): 
>>>ansible-tmp-1727204213.7145035-43411-123180081496258=/root/.ansible/tmp/ansible-tmp-1727204213.7145035-43411-123180081496258 <<< 41016 1727204213.75546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204213.75596: stderr chunk (state=3): >>><<< 41016 1727204213.75599: stdout chunk (state=3): >>><<< 41016 1727204213.75625: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204213.7145035-43411-123180081496258=/root/.ansible/tmp/ansible-tmp-1727204213.7145035-43411-123180081496258 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204213.75657: variable 'ansible_module_compression' from source: unknown 41016 1727204213.75837: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204213.75870: variable 'ansible_facts' from source: unknown 41016 1727204213.76284: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204213.7145035-43411-123180081496258/AnsiballZ_command.py 41016 1727204213.76584: Sending initial data 41016 1727204213.76587: Sent initial data (156 bytes) 41016 1727204213.77823: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204213.77929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204213.78157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204213.78164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
41016 1727204213.78325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204213.80202: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204213.80348: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204213.80394: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpd6m4a_hd /root/.ansible/tmp/ansible-tmp-1727204213.7145035-43411-123180081496258/AnsiballZ_command.py <<< 41016 1727204213.80397: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204213.7145035-43411-123180081496258/AnsiballZ_command.py" <<< 41016 1727204213.80482: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpd6m4a_hd" to remote "/root/.ansible/tmp/ansible-tmp-1727204213.7145035-43411-123180081496258/AnsiballZ_command.py" <<< 41016 1727204213.80488: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204213.7145035-43411-123180081496258/AnsiballZ_command.py" <<< 41016 1727204213.82177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204213.82190: stdout chunk (state=3): >>><<< 41016 1727204213.82201: stderr chunk (state=3): >>><<< 41016 1727204213.82224: done transferring module to remote 41016 1727204213.82235: _low_level_execute_command(): starting 41016 1727204213.82239: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204213.7145035-43411-123180081496258/ /root/.ansible/tmp/ansible-tmp-1727204213.7145035-43411-123180081496258/AnsiballZ_command.py && sleep 0' 41016 1727204213.83164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204213.83183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204213.83286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204213.83334: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204213.83580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204213.85530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204213.85542: stdout chunk (state=3): >>><<< 41016 1727204213.85554: stderr chunk (state=3): >>><<< 41016 1727204213.85578: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204213.85592: _low_level_execute_command(): starting 41016 1727204213.85602: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204213.7145035-43411-123180081496258/AnsiballZ_command.py && sleep 0' 41016 1727204213.86236: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204213.86250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204213.86263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204213.86283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204213.86301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204213.86315: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204213.86329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204213.86348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204213.86360: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204213.86444: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204213.86465: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204213.86591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204214.04446: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest1"], "start": "2024-09-24 14:56:54.030392", "end": "2024-09-24 14:56:54.040971", "delta": "0:00:00.010579", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204214.07242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204214.07267: stdout chunk (state=3): >>><<< 41016 1727204214.07289: stderr chunk (state=3): >>><<< 41016 1727204214.07434: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest1"], "start": "2024-09-24 14:56:54.030392", "end": "2024-09-24 14:56:54.040971", "delta": "0:00:00.010579", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
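The records above show how the test interface is removed: the ansible.legacy.command module is staged into the remote temporary directory, executed with /usr/bin/python3.12, and runs "ip link del ethtest1" on managed-node1 (10.31.14.47), finishing with rc=0 after roughly 0.01s. As a minimal sketch only, and not part of this playbook run, the same removal plus a follow-up check could be reproduced by hand over the multiplexed SSH connection visible in this log; the "ip -o link show" verification step is an assumption added here for illustration:

    ssh -o ControlPath=/root/.ansible/cp/a0f5415566 root@10.31.14.47 '
        ip link del ethtest1 2>/dev/null                      # same command the module ran above
        ip -o link show ethtest1 || echo "ethtest1 removed"   # hypothetical verification, not in this log
    '
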
41016 1727204214.07484: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204213.7145035-43411-123180081496258/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204214.07608: _low_level_execute_command(): starting 41016 1727204214.07682: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204213.7145035-43411-123180081496258/ > /dev/null 2>&1 && sleep 0' 41016 1727204214.09446: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204214.09547: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204214.09601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204214.09629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204214.09922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204214.09965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204214.10005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204214.10139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204214.12148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204214.12392: stderr chunk (state=3): >>><<< 41016 1727204214.12479: stdout chunk (state=3): >>><<< 41016 1727204214.12488: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204214.12502: handler run complete 41016 1727204214.12605: Evaluated conditional (False): False 41016 1727204214.12614: attempt loop complete, returning result 41016 1727204214.12621: _execute() done 41016 1727204214.12636: dumping result to json 41016 1727204214.12644: done dumping result, returning 41016 1727204214.12651: done running TaskExecutor() for managed-node1/TASK: Remove test interface if necessary [028d2410-947f-12d5-0ec4-000000000777] 41016 1727204214.12664: sending task result for task 028d2410-947f-12d5-0ec4-000000000777 41016 1727204214.13095: done sending task result for task 028d2410-947f-12d5-0ec4-000000000777 ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest1" ], "delta": "0:00:00.010579", "end": "2024-09-24 14:56:54.040971", "rc": 0, "start": "2024-09-24 14:56:54.030392" } 41016 1727204214.13197: no more pending results, returning what we have 41016 1727204214.13202: results queue empty 41016 1727204214.13203: checking for any_errors_fatal 41016 1727204214.13205: done checking for any_errors_fatal 41016 1727204214.13206: checking for max_fail_percentage 41016 1727204214.13207: done checking for max_fail_percentage 41016 1727204214.13208: checking to see if all hosts have failed and the running result is not ok 41016 1727204214.13211: done checking to see if all hosts have failed 41016 1727204214.13212: getting the remaining hosts for this loop 41016 1727204214.13215: done getting the remaining hosts for this loop 41016 1727204214.13218: getting the next task for host managed-node1 41016 1727204214.13227: done getting next task for host managed-node1 41016 1727204214.13230: ^ task is: TASK: Assert interface1 is absent 41016 1727204214.13234: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204214.13239: getting variables 41016 1727204214.13241: in VariableManager get_vars() 41016 1727204214.13406: Calling all_inventory to load vars for managed-node1 41016 1727204214.13413: Calling groups_inventory to load vars for managed-node1 41016 1727204214.13416: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204214.13427: Calling all_plugins_play to load vars for managed-node1 41016 1727204214.13431: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204214.13434: Calling groups_plugins_play to load vars for managed-node1 41016 1727204214.26884: WORKER PROCESS EXITING 41016 1727204214.28770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204214.32053: done with get_vars() 41016 1727204214.32082: done getting variables TASK [Assert interface1 is absent] ********************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:153 Tuesday 24 September 2024 14:56:54 -0400 (0:00:00.674) 0:00:37.997 ***** 41016 1727204214.32163: entering _queue_task() for managed-node1/include_tasks 41016 1727204214.32915: worker is 1 (out of 1 available) 41016 1727204214.32928: exiting _queue_task() for managed-node1/include_tasks 41016 1727204214.32939: done queuing things up, now waiting for results queue to drain 41016 1727204214.32941: waiting for pending results... 41016 1727204214.33470: running TaskExecutor() for managed-node1/TASK: Assert interface1 is absent 41016 1727204214.33807: in run() - task 028d2410-947f-12d5-0ec4-0000000000b6 41016 1727204214.33829: variable 'ansible_search_path' from source: unknown 41016 1727204214.33871: calling self._execute() 41016 1727204214.34092: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204214.34111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204214.34193: variable 'omit' from source: magic vars 41016 1727204214.34932: variable 'ansible_distribution_major_version' from source: facts 41016 1727204214.34995: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204214.35094: _execute() done 41016 1727204214.35106: dumping result to json 41016 1727204214.35116: done dumping result, returning 41016 1727204214.35126: done running TaskExecutor() for managed-node1/TASK: Assert interface1 is absent [028d2410-947f-12d5-0ec4-0000000000b6] 41016 1727204214.35137: sending task result for task 028d2410-947f-12d5-0ec4-0000000000b6 41016 1727204214.35265: no more pending results, returning what we have 41016 1727204214.35272: in VariableManager get_vars() 41016 1727204214.35325: Calling all_inventory to load vars for managed-node1 41016 1727204214.35329: Calling groups_inventory to load vars for managed-node1 41016 1727204214.35332: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204214.35345: Calling all_plugins_play to load vars for managed-node1 41016 1727204214.35349: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204214.35353: Calling groups_plugins_play to load vars for managed-node1 41016 1727204214.36383: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000b6 41016 1727204214.36387: WORKER PROCESS EXITING 41016 1727204214.38390: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204214.41635: done with 
get_vars() 41016 1727204214.41661: variable 'ansible_search_path' from source: unknown 41016 1727204214.42083: we have included files to process 41016 1727204214.42085: generating all_blocks data 41016 1727204214.42087: done generating all_blocks data 41016 1727204214.42093: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41016 1727204214.42094: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41016 1727204214.42097: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41016 1727204214.42267: in VariableManager get_vars() 41016 1727204214.42296: done with get_vars() 41016 1727204214.42612: done processing included file 41016 1727204214.42614: iterating over new_blocks loaded from include file 41016 1727204214.42616: in VariableManager get_vars() 41016 1727204214.42634: done with get_vars() 41016 1727204214.42636: filtering new block on tags 41016 1727204214.42670: done filtering new block on tags 41016 1727204214.42672: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node1 41016 1727204214.42681: extending task lists for all hosts with included blocks 41016 1727204214.45420: done extending task lists 41016 1727204214.45422: done processing included files 41016 1727204214.45423: results queue empty 41016 1727204214.45423: checking for any_errors_fatal 41016 1727204214.45430: done checking for any_errors_fatal 41016 1727204214.45430: checking for max_fail_percentage 41016 1727204214.45432: done checking for max_fail_percentage 41016 1727204214.45433: checking to see if all hosts have failed and the running result is not ok 41016 1727204214.45434: done checking to see if all hosts have failed 41016 1727204214.45434: getting the remaining hosts for this loop 41016 1727204214.45435: done getting the remaining hosts for this loop 41016 1727204214.45438: getting the next task for host managed-node1 41016 1727204214.45443: done getting next task for host managed-node1 41016 1727204214.45445: ^ task is: TASK: Include the task 'get_interface_stat.yml' 41016 1727204214.45448: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204214.45451: getting variables 41016 1727204214.45452: in VariableManager get_vars() 41016 1727204214.45470: Calling all_inventory to load vars for managed-node1 41016 1727204214.45472: Calling groups_inventory to load vars for managed-node1 41016 1727204214.45474: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204214.45483: Calling all_plugins_play to load vars for managed-node1 41016 1727204214.45486: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204214.45489: Calling groups_plugins_play to load vars for managed-node1 41016 1727204214.48302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204214.51619: done with get_vars() 41016 1727204214.51650: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 14:56:54 -0400 (0:00:00.195) 0:00:38.193 ***** 41016 1727204214.51737: entering _queue_task() for managed-node1/include_tasks 41016 1727204214.52606: worker is 1 (out of 1 available) 41016 1727204214.52619: exiting _queue_task() for managed-node1/include_tasks 41016 1727204214.52632: done queuing things up, now waiting for results queue to drain 41016 1727204214.52633: waiting for pending results... 41016 1727204214.53225: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 41016 1727204214.53449: in run() - task 028d2410-947f-12d5-0ec4-000000000816 41016 1727204214.53462: variable 'ansible_search_path' from source: unknown 41016 1727204214.53467: variable 'ansible_search_path' from source: unknown 41016 1727204214.53592: calling self._execute() 41016 1727204214.53839: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204214.53843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204214.53851: variable 'omit' from source: magic vars 41016 1727204214.54781: variable 'ansible_distribution_major_version' from source: facts 41016 1727204214.54785: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204214.54787: _execute() done 41016 1727204214.54791: dumping result to json 41016 1727204214.54794: done dumping result, returning 41016 1727204214.54854: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-12d5-0ec4-000000000816] 41016 1727204214.54858: sending task result for task 028d2410-947f-12d5-0ec4-000000000816 41016 1727204214.54971: done sending task result for task 028d2410-947f-12d5-0ec4-000000000816 41016 1727204214.54974: WORKER PROCESS EXITING 41016 1727204214.55013: no more pending results, returning what we have 41016 1727204214.55022: in VariableManager get_vars() 41016 1727204214.55074: Calling all_inventory to load vars for managed-node1 41016 1727204214.55080: Calling groups_inventory to load vars for managed-node1 41016 1727204214.55082: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204214.55095: Calling all_plugins_play to load vars for managed-node1 41016 1727204214.55098: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204214.55101: Calling groups_plugins_play to load vars for managed-node1 41016 1727204214.58200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 41016 1727204214.60429: done with get_vars() 41016 1727204214.60456: variable 'ansible_search_path' from source: unknown 41016 1727204214.60457: variable 'ansible_search_path' from source: unknown 41016 1727204214.60497: we have included files to process 41016 1727204214.60498: generating all_blocks data 41016 1727204214.60500: done generating all_blocks data 41016 1727204214.60501: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41016 1727204214.60502: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41016 1727204214.60504: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41016 1727204214.60699: done processing included file 41016 1727204214.60701: iterating over new_blocks loaded from include file 41016 1727204214.60703: in VariableManager get_vars() 41016 1727204214.60722: done with get_vars() 41016 1727204214.60724: filtering new block on tags 41016 1727204214.60752: done filtering new block on tags 41016 1727204214.60757: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 41016 1727204214.60762: extending task lists for all hosts with included blocks 41016 1727204214.60953: done extending task lists 41016 1727204214.60955: done processing included files 41016 1727204214.60956: results queue empty 41016 1727204214.60956: checking for any_errors_fatal 41016 1727204214.60994: done checking for any_errors_fatal 41016 1727204214.60995: checking for max_fail_percentage 41016 1727204214.60996: done checking for max_fail_percentage 41016 1727204214.60997: checking to see if all hosts have failed and the running result is not ok 41016 1727204214.60998: done checking to see if all hosts have failed 41016 1727204214.60999: getting the remaining hosts for this loop 41016 1727204214.61000: done getting the remaining hosts for this loop 41016 1727204214.61003: getting the next task for host managed-node1 41016 1727204214.61007: done getting next task for host managed-node1 41016 1727204214.61010: ^ task is: TASK: Get stat for interface {{ interface }} 41016 1727204214.61013: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 41016 1727204214.61016: getting variables 41016 1727204214.61017: in VariableManager get_vars() 41016 1727204214.61030: Calling all_inventory to load vars for managed-node1 41016 1727204214.61032: Calling groups_inventory to load vars for managed-node1 41016 1727204214.61034: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204214.61040: Calling all_plugins_play to load vars for managed-node1 41016 1727204214.61042: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204214.61044: Calling groups_plugins_play to load vars for managed-node1 41016 1727204214.63117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204214.64929: done with get_vars() 41016 1727204214.64998: done getting variables 41016 1727204214.65320: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest1] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:56:54 -0400 (0:00:00.136) 0:00:38.329 ***** 41016 1727204214.65353: entering _queue_task() for managed-node1/stat 41016 1727204214.66213: worker is 1 (out of 1 available) 41016 1727204214.66227: exiting _queue_task() for managed-node1/stat 41016 1727204214.66242: done queuing things up, now waiting for results queue to drain 41016 1727204214.66243: waiting for pending results... 41016 1727204214.67042: running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest1 41016 1727204214.67574: in run() - task 028d2410-947f-12d5-0ec4-0000000008bc 41016 1727204214.67581: variable 'ansible_search_path' from source: unknown 41016 1727204214.67584: variable 'ansible_search_path' from source: unknown 41016 1727204214.67886: calling self._execute() 41016 1727204214.68134: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204214.68140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204214.68151: variable 'omit' from source: magic vars 41016 1727204214.68972: variable 'ansible_distribution_major_version' from source: facts 41016 1727204214.69098: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204214.69104: variable 'omit' from source: magic vars 41016 1727204214.69160: variable 'omit' from source: magic vars 41016 1727204214.69413: variable 'interface' from source: set_fact 41016 1727204214.69425: variable 'omit' from source: magic vars 41016 1727204214.69467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204214.69503: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204214.69642: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204214.69659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204214.69737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204214.69762: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204214.69766: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204214.69769: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204214.70112: Set connection var ansible_shell_executable to /bin/sh 41016 1727204214.70122: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204214.70179: Set connection var ansible_shell_type to sh 41016 1727204214.70184: Set connection var ansible_timeout to 10 41016 1727204214.70186: Set connection var ansible_pipelining to False 41016 1727204214.70189: Set connection var ansible_connection to ssh 41016 1727204214.70583: variable 'ansible_shell_executable' from source: unknown 41016 1727204214.70586: variable 'ansible_connection' from source: unknown 41016 1727204214.70589: variable 'ansible_module_compression' from source: unknown 41016 1727204214.70591: variable 'ansible_shell_type' from source: unknown 41016 1727204214.70593: variable 'ansible_shell_executable' from source: unknown 41016 1727204214.70595: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204214.70597: variable 'ansible_pipelining' from source: unknown 41016 1727204214.70598: variable 'ansible_timeout' from source: unknown 41016 1727204214.70600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204214.70763: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204214.70773: variable 'omit' from source: magic vars 41016 1727204214.70781: starting attempt loop 41016 1727204214.70784: running the handler 41016 1727204214.70800: _low_level_execute_command(): starting 41016 1727204214.70805: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204214.72344: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204214.72363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204214.72591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204214.72827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204214.73079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204214.74898: stdout chunk (state=3): >>>/root <<< 41016 1727204214.75002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204214.75049: stderr chunk (state=3): >>><<< 41016 1727204214.75098: stdout chunk (state=3): >>><<< 41016 1727204214.75130: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204214.75145: _low_level_execute_command(): starting 41016 1727204214.75154: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204214.7513049-43527-150892515361764 `" && echo ansible-tmp-1727204214.7513049-43527-150892515361764="` echo /root/.ansible/tmp/ansible-tmp-1727204214.7513049-43527-150892515361764 `" ) && sleep 0' 41016 1727204214.76552: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204214.76556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204214.76559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204214.76562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204214.76567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204214.76570: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204214.76587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204214.76590: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204214.76592: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204214.76594: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41016 1727204214.76596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204214.76693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204214.76697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204214.76700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204214.76802: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204214.76836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting 
O_NONBLOCK <<< 41016 1727204214.76851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204214.76962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204214.79140: stdout chunk (state=3): >>>ansible-tmp-1727204214.7513049-43527-150892515361764=/root/.ansible/tmp/ansible-tmp-1727204214.7513049-43527-150892515361764 <<< 41016 1727204214.79258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204214.79358: stderr chunk (state=3): >>><<< 41016 1727204214.79361: stdout chunk (state=3): >>><<< 41016 1727204214.79458: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204214.7513049-43527-150892515361764=/root/.ansible/tmp/ansible-tmp-1727204214.7513049-43527-150892515361764 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204214.79509: variable 'ansible_module_compression' from source: unknown 41016 1727204214.79717: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 41016 1727204214.79754: variable 'ansible_facts' from source: unknown 41016 1727204214.79870: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204214.7513049-43527-150892515361764/AnsiballZ_stat.py 41016 1727204214.80281: Sending initial data 41016 1727204214.80284: Sent initial data (153 bytes) 41016 1727204214.81481: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204214.81485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204214.81487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204214.81489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204214.81494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204214.81650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204214.81835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204214.83612: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204214.83685: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204214.83754: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpk9__oky1 /root/.ansible/tmp/ansible-tmp-1727204214.7513049-43527-150892515361764/AnsiballZ_stat.py <<< 41016 1727204214.83758: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204214.7513049-43527-150892515361764/AnsiballZ_stat.py" <<< 41016 1727204214.83846: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpk9__oky1" to remote "/root/.ansible/tmp/ansible-tmp-1727204214.7513049-43527-150892515361764/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204214.7513049-43527-150892515361764/AnsiballZ_stat.py" <<< 41016 1727204214.85410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204214.85438: stderr chunk (state=3): >>><<< 41016 1727204214.85441: stdout chunk (state=3): >>><<< 41016 1727204214.85527: done transferring module to remote 41016 1727204214.85530: _low_level_execute_command(): starting 41016 1727204214.85533: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204214.7513049-43527-150892515361764/ /root/.ansible/tmp/ansible-tmp-1727204214.7513049-43527-150892515361764/AnsiballZ_stat.py && sleep 0' 41016 1727204214.86725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204214.86735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204214.86955: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204214.86992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204214.86998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204214.87043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204214.87211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204214.89134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204214.89253: stderr chunk (state=3): >>><<< 41016 1727204214.89256: stdout chunk (state=3): >>><<< 41016 1727204214.89280: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204214.89283: _low_level_execute_command(): starting 41016 1727204214.89350: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204214.7513049-43527-150892515361764/AnsiballZ_stat.py && sleep 0' 41016 1727204214.90463: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204214.90467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' <<< 41016 1727204214.90691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204214.90881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204215.07447: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 41016 1727204215.08995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204215.09006: stdout chunk (state=3): >>><<< 41016 1727204215.09019: stderr chunk (state=3): >>><<< 41016 1727204215.09156: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
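The stat payload above (exists=false for /sys/class/net/ethtest1) is produced by the included get_interface_stat.yml. A minimal sketch of that task, reconstructed from the module_args in the log and from the interface_stat variable that the assertion below reads; treating the path as templated on {{ interface }} is an assumption based on the templated task name shown earlier.

# get_interface_stat.yml (sketch)
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat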
41016 1727204215.09161: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204214.7513049-43527-150892515361764/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204215.09163: _low_level_execute_command(): starting 41016 1727204215.09165: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204214.7513049-43527-150892515361764/ > /dev/null 2>&1 && sleep 0' 41016 1727204215.09651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204215.09667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204215.09685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204215.09792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204215.09805: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204215.09913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204215.12030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204215.12043: stdout chunk (state=3): >>><<< 41016 1727204215.12057: stderr chunk (state=3): >>><<< 41016 1727204215.12081: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204215.12094: handler run complete 41016 1727204215.12128: attempt loop complete, returning result 41016 1727204215.12141: _execute() done 41016 1727204215.12150: dumping result to json 41016 1727204215.12160: done dumping result, returning 41016 1727204215.12172: done running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest1 [028d2410-947f-12d5-0ec4-0000000008bc] 41016 1727204215.12182: sending task result for task 028d2410-947f-12d5-0ec4-0000000008bc 41016 1727204215.12401: done sending task result for task 028d2410-947f-12d5-0ec4-0000000008bc 41016 1727204215.12404: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 41016 1727204215.12469: no more pending results, returning what we have 41016 1727204215.12474: results queue empty 41016 1727204215.12477: checking for any_errors_fatal 41016 1727204215.12479: done checking for any_errors_fatal 41016 1727204215.12479: checking for max_fail_percentage 41016 1727204215.12481: done checking for max_fail_percentage 41016 1727204215.12482: checking to see if all hosts have failed and the running result is not ok 41016 1727204215.12483: done checking to see if all hosts have failed 41016 1727204215.12484: getting the remaining hosts for this loop 41016 1727204215.12486: done getting the remaining hosts for this loop 41016 1727204215.12490: getting the next task for host managed-node1 41016 1727204215.12499: done getting next task for host managed-node1 41016 1727204215.12503: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 41016 1727204215.12508: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204215.12514: getting variables 41016 1727204215.12516: in VariableManager get_vars() 41016 1727204215.12565: Calling all_inventory to load vars for managed-node1 41016 1727204215.12568: Calling groups_inventory to load vars for managed-node1 41016 1727204215.12571: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204215.12699: Calling all_plugins_play to load vars for managed-node1 41016 1727204215.12704: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204215.12708: Calling groups_plugins_play to load vars for managed-node1 41016 1727204215.14932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204215.16942: done with get_vars() 41016 1727204215.16971: done getting variables 41016 1727204215.17056: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204215.17373: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest1'] ************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 14:56:55 -0400 (0:00:00.520) 0:00:38.850 ***** 41016 1727204215.17407: entering _queue_task() for managed-node1/assert 41016 1727204215.18091: worker is 1 (out of 1 available) 41016 1727204215.18100: exiting _queue_task() for managed-node1/assert 41016 1727204215.18111: done queuing things up, now waiting for results queue to drain 41016 1727204215.18112: waiting for pending results... 
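Both entries of assert_device_absent.yml have now been reached: the include at line 3 (get_interface_stat.yml) and the assertion at line 5, whose condition (not interface_stat.stat.exists) is evaluated just below. A minimal sketch of that file, reconstructed from the task paths, task names and the evaluated conditional in the log; anything beyond that (for example an explicit msg) is omitted rather than guessed.

# assert_device_absent.yml (sketch)
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: Assert that the interface is absent - '{{ interface }}'
  assert:
    that:
      - not interface_stat.stat.exists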
41016 1727204215.18354: running TaskExecutor() for managed-node1/TASK: Assert that the interface is absent - 'ethtest1' 41016 1727204215.18528: in run() - task 028d2410-947f-12d5-0ec4-000000000817 41016 1727204215.18533: variable 'ansible_search_path' from source: unknown 41016 1727204215.18536: variable 'ansible_search_path' from source: unknown 41016 1727204215.18557: calling self._execute() 41016 1727204215.18668: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204215.18684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204215.18699: variable 'omit' from source: magic vars 41016 1727204215.19107: variable 'ansible_distribution_major_version' from source: facts 41016 1727204215.19182: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204215.19185: variable 'omit' from source: magic vars 41016 1727204215.19194: variable 'omit' from source: magic vars 41016 1727204215.19308: variable 'interface' from source: set_fact 41016 1727204215.19338: variable 'omit' from source: magic vars 41016 1727204215.19390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204215.19443: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204215.19472: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204215.19498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204215.19535: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204215.19562: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204215.19618: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204215.19621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204215.19703: Set connection var ansible_shell_executable to /bin/sh 41016 1727204215.19716: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204215.19733: Set connection var ansible_shell_type to sh 41016 1727204215.19743: Set connection var ansible_timeout to 10 41016 1727204215.19757: Set connection var ansible_pipelining to False 41016 1727204215.19769: Set connection var ansible_connection to ssh 41016 1727204215.19799: variable 'ansible_shell_executable' from source: unknown 41016 1727204215.19834: variable 'ansible_connection' from source: unknown 41016 1727204215.19838: variable 'ansible_module_compression' from source: unknown 41016 1727204215.19841: variable 'ansible_shell_type' from source: unknown 41016 1727204215.19843: variable 'ansible_shell_executable' from source: unknown 41016 1727204215.19846: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204215.19848: variable 'ansible_pipelining' from source: unknown 41016 1727204215.19850: variable 'ansible_timeout' from source: unknown 41016 1727204215.19852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204215.20084: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 41016 1727204215.20088: variable 'omit' from source: magic vars 41016 1727204215.20091: starting attempt loop 41016 1727204215.20093: running the handler 41016 1727204215.20233: variable 'interface_stat' from source: set_fact 41016 1727204215.20289: Evaluated conditional (not interface_stat.stat.exists): True 41016 1727204215.20304: handler run complete 41016 1727204215.20357: attempt loop complete, returning result 41016 1727204215.20364: _execute() done 41016 1727204215.20370: dumping result to json 41016 1727204215.20387: done dumping result, returning 41016 1727204215.20401: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is absent - 'ethtest1' [028d2410-947f-12d5-0ec4-000000000817] 41016 1727204215.20411: sending task result for task 028d2410-947f-12d5-0ec4-000000000817 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 41016 1727204215.20694: no more pending results, returning what we have 41016 1727204215.20698: results queue empty 41016 1727204215.20699: checking for any_errors_fatal 41016 1727204215.20711: done checking for any_errors_fatal 41016 1727204215.20712: checking for max_fail_percentage 41016 1727204215.20714: done checking for max_fail_percentage 41016 1727204215.20715: checking to see if all hosts have failed and the running result is not ok 41016 1727204215.20716: done checking to see if all hosts have failed 41016 1727204215.20716: getting the remaining hosts for this loop 41016 1727204215.20718: done getting the remaining hosts for this loop 41016 1727204215.20722: getting the next task for host managed-node1 41016 1727204215.20736: done getting next task for host managed-node1 41016 1727204215.20740: ^ task is: TASK: Set interface0 41016 1727204215.20743: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204215.20874: getting variables 41016 1727204215.20888: in VariableManager get_vars() 41016 1727204215.21028: Calling all_inventory to load vars for managed-node1 41016 1727204215.21031: Calling groups_inventory to load vars for managed-node1 41016 1727204215.21034: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204215.21045: Calling all_plugins_play to load vars for managed-node1 41016 1727204215.21048: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204215.21051: Calling groups_plugins_play to load vars for managed-node1 41016 1727204215.21705: done sending task result for task 028d2410-947f-12d5-0ec4-000000000817 41016 1727204215.21708: WORKER PROCESS EXITING 41016 1727204215.23935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204215.25875: done with get_vars() 41016 1727204215.25905: done getting variables 41016 1727204215.25971: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set interface0] ********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:155 Tuesday 24 September 2024 14:56:55 -0400 (0:00:00.085) 0:00:38.936 ***** 41016 1727204215.26005: entering _queue_task() for managed-node1/set_fact 41016 1727204215.26414: worker is 1 (out of 1 available) 41016 1727204215.26427: exiting _queue_task() for managed-node1/set_fact 41016 1727204215.26440: done queuing things up, now waiting for results queue to drain 41016 1727204215.26441: waiting for pending results... 
41016 1727204215.26738: running TaskExecutor() for managed-node1/TASK: Set interface0 41016 1727204215.26925: in run() - task 028d2410-947f-12d5-0ec4-0000000000b7 41016 1727204215.26953: variable 'ansible_search_path' from source: unknown 41016 1727204215.27001: calling self._execute() 41016 1727204215.27146: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204215.27160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204215.27178: variable 'omit' from source: magic vars 41016 1727204215.27702: variable 'ansible_distribution_major_version' from source: facts 41016 1727204215.27725: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204215.27738: variable 'omit' from source: magic vars 41016 1727204215.27893: variable 'omit' from source: magic vars 41016 1727204215.27897: variable 'interface0' from source: play vars 41016 1727204215.27923: variable 'interface0' from source: play vars 41016 1727204215.27948: variable 'omit' from source: magic vars 41016 1727204215.28012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204215.28105: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204215.28197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204215.28224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204215.28291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204215.28353: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204215.28369: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204215.28438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204215.28572: Set connection var ansible_shell_executable to /bin/sh 41016 1727204215.28628: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204215.28697: Set connection var ansible_shell_type to sh 41016 1727204215.28707: Set connection var ansible_timeout to 10 41016 1727204215.28737: Set connection var ansible_pipelining to False 41016 1727204215.28749: Set connection var ansible_connection to ssh 41016 1727204215.28854: variable 'ansible_shell_executable' from source: unknown 41016 1727204215.28861: variable 'ansible_connection' from source: unknown 41016 1727204215.28870: variable 'ansible_module_compression' from source: unknown 41016 1727204215.28873: variable 'ansible_shell_type' from source: unknown 41016 1727204215.28903: variable 'ansible_shell_executable' from source: unknown 41016 1727204215.28914: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204215.28917: variable 'ansible_pipelining' from source: unknown 41016 1727204215.28920: variable 'ansible_timeout' from source: unknown 41016 1727204215.28922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204215.29200: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 
1727204215.29203: variable 'omit' from source: magic vars 41016 1727204215.29208: starting attempt loop 41016 1727204215.29215: running the handler 41016 1727204215.29333: handler run complete 41016 1727204215.29336: attempt loop complete, returning result 41016 1727204215.29342: _execute() done 41016 1727204215.29347: dumping result to json 41016 1727204215.29352: done dumping result, returning 41016 1727204215.29355: done running TaskExecutor() for managed-node1/TASK: Set interface0 [028d2410-947f-12d5-0ec4-0000000000b7] 41016 1727204215.29357: sending task result for task 028d2410-947f-12d5-0ec4-0000000000b7 ok: [managed-node1] => { "ansible_facts": { "interface": "ethtest0" }, "changed": false } 41016 1727204215.29534: no more pending results, returning what we have 41016 1727204215.29538: results queue empty 41016 1727204215.29540: checking for any_errors_fatal 41016 1727204215.29554: done checking for any_errors_fatal 41016 1727204215.29556: checking for max_fail_percentage 41016 1727204215.29558: done checking for max_fail_percentage 41016 1727204215.29559: checking to see if all hosts have failed and the running result is not ok 41016 1727204215.29560: done checking to see if all hosts have failed 41016 1727204215.29560: getting the remaining hosts for this loop 41016 1727204215.29562: done getting the remaining hosts for this loop 41016 1727204215.29569: getting the next task for host managed-node1 41016 1727204215.29583: done getting next task for host managed-node1 41016 1727204215.29587: ^ task is: TASK: Delete interface0 41016 1727204215.29591: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204215.29596: getting variables 41016 1727204215.29598: in VariableManager get_vars() 41016 1727204215.29658: Calling all_inventory to load vars for managed-node1 41016 1727204215.29661: Calling groups_inventory to load vars for managed-node1 41016 1727204215.29664: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204215.29913: Calling all_plugins_play to load vars for managed-node1 41016 1727204215.29918: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204215.29923: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000b7 41016 1727204215.29926: WORKER PROCESS EXITING 41016 1727204215.29930: Calling groups_plugins_play to load vars for managed-node1 41016 1727204215.31815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204215.33746: done with get_vars() 41016 1727204215.33781: done getting variables TASK [Delete interface0] ******************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:158 Tuesday 24 September 2024 14:56:55 -0400 (0:00:00.079) 0:00:39.015 ***** 41016 1727204215.33965: entering _queue_task() for managed-node1/include_tasks 41016 1727204215.34582: worker is 1 (out of 1 available) 41016 1727204215.34592: exiting _queue_task() for managed-node1/include_tasks 41016 1727204215.34608: done queuing things up, now waiting for results queue to drain 41016 1727204215.34609: waiting for pending results... 41016 1727204215.34882: running TaskExecutor() for managed-node1/TASK: Delete interface0 41016 1727204215.35030: in run() - task 028d2410-947f-12d5-0ec4-0000000000b8 41016 1727204215.35083: variable 'ansible_search_path' from source: unknown 41016 1727204215.35156: calling self._execute() 41016 1727204215.35397: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204215.35401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204215.35404: variable 'omit' from source: magic vars 41016 1727204215.35990: variable 'ansible_distribution_major_version' from source: facts 41016 1727204215.36012: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204215.36024: _execute() done 41016 1727204215.36031: dumping result to json 41016 1727204215.36045: done dumping result, returning 41016 1727204215.36071: done running TaskExecutor() for managed-node1/TASK: Delete interface0 [028d2410-947f-12d5-0ec4-0000000000b8] 41016 1727204215.36121: sending task result for task 028d2410-947f-12d5-0ec4-0000000000b8 41016 1727204215.36496: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000b8 41016 1727204215.36500: WORKER PROCESS EXITING 41016 1727204215.36557: no more pending results, returning what we have 41016 1727204215.36563: in VariableManager get_vars() 41016 1727204215.36623: Calling all_inventory to load vars for managed-node1 41016 1727204215.36627: Calling groups_inventory to load vars for managed-node1 41016 1727204215.36630: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204215.36658: Calling all_plugins_play to load vars for managed-node1 41016 1727204215.36662: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204215.36669: Calling groups_plugins_play to load vars for managed-node1 41016 1727204215.39096: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204215.41071: done with get_vars() 41016 1727204215.41094: variable 'ansible_search_path' from source: unknown 41016 1727204215.41127: we have included files to process 41016 1727204215.41128: generating all_blocks data 41016 1727204215.41130: done generating all_blocks data 41016 1727204215.41137: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41016 1727204215.41139: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41016 1727204215.41141: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 41016 1727204215.41564: done processing included file 41016 1727204215.41566: iterating over new_blocks loaded from include file 41016 1727204215.41568: in VariableManager get_vars() 41016 1727204215.41593: done with get_vars() 41016 1727204215.41612: filtering new block on tags 41016 1727204215.41638: done filtering new block on tags 41016 1727204215.41641: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node1 41016 1727204215.41654: extending task lists for all hosts with included blocks 41016 1727204215.44985: done extending task lists 41016 1727204215.44986: done processing included files 41016 1727204215.44987: results queue empty 41016 1727204215.44988: checking for any_errors_fatal 41016 1727204215.44991: done checking for any_errors_fatal 41016 1727204215.44992: checking for max_fail_percentage 41016 1727204215.44993: done checking for max_fail_percentage 41016 1727204215.44994: checking to see if all hosts have failed and the running result is not ok 41016 1727204215.44995: done checking to see if all hosts have failed 41016 1727204215.44995: getting the remaining hosts for this loop 41016 1727204215.44997: done getting the remaining hosts for this loop 41016 1727204215.44999: getting the next task for host managed-node1 41016 1727204215.45004: done getting next task for host managed-node1 41016 1727204215.45006: ^ task is: TASK: Remove test interface if necessary 41016 1727204215.45009: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204215.45012: getting variables 41016 1727204215.45013: in VariableManager get_vars() 41016 1727204215.45031: Calling all_inventory to load vars for managed-node1 41016 1727204215.45034: Calling groups_inventory to load vars for managed-node1 41016 1727204215.45036: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204215.45043: Calling all_plugins_play to load vars for managed-node1 41016 1727204215.45045: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204215.45048: Calling groups_plugins_play to load vars for managed-node1 41016 1727204215.47353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204215.51226: done with get_vars() 41016 1727204215.51559: done getting variables 41016 1727204215.51610: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 14:56:55 -0400 (0:00:00.176) 0:00:39.192 ***** 41016 1727204215.51642: entering _queue_task() for managed-node1/command 41016 1727204215.52811: worker is 1 (out of 1 available) 41016 1727204215.52821: exiting _queue_task() for managed-node1/command 41016 1727204215.52830: done queuing things up, now waiting for results queue to drain 41016 1727204215.52832: waiting for pending results... 41016 1727204215.53396: running TaskExecutor() for managed-node1/TASK: Remove test interface if necessary 41016 1727204215.53505: in run() - task 028d2410-947f-12d5-0ec4-0000000008da 41016 1727204215.53526: variable 'ansible_search_path' from source: unknown 41016 1727204215.53682: variable 'ansible_search_path' from source: unknown 41016 1727204215.53687: calling self._execute() 41016 1727204215.53849: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204215.53860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204215.53874: variable 'omit' from source: magic vars 41016 1727204215.54647: variable 'ansible_distribution_major_version' from source: facts 41016 1727204215.54686: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204215.54982: variable 'omit' from source: magic vars 41016 1727204215.54986: variable 'omit' from source: magic vars 41016 1727204215.55035: variable 'interface' from source: set_fact 41016 1727204215.55059: variable 'omit' from source: magic vars 41016 1727204215.55219: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204215.55260: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204215.55286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204215.55310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204215.55326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 
1727204215.55411: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204215.55422: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204215.55430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204215.55645: Set connection var ansible_shell_executable to /bin/sh 41016 1727204215.55655: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204215.55664: Set connection var ansible_shell_type to sh 41016 1727204215.55852: Set connection var ansible_timeout to 10 41016 1727204215.55855: Set connection var ansible_pipelining to False 41016 1727204215.55857: Set connection var ansible_connection to ssh 41016 1727204215.55859: variable 'ansible_shell_executable' from source: unknown 41016 1727204215.55862: variable 'ansible_connection' from source: unknown 41016 1727204215.55864: variable 'ansible_module_compression' from source: unknown 41016 1727204215.55866: variable 'ansible_shell_type' from source: unknown 41016 1727204215.55869: variable 'ansible_shell_executable' from source: unknown 41016 1727204215.55870: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204215.55872: variable 'ansible_pipelining' from source: unknown 41016 1727204215.55874: variable 'ansible_timeout' from source: unknown 41016 1727204215.55878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204215.56082: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204215.56192: variable 'omit' from source: magic vars 41016 1727204215.56198: starting attempt loop 41016 1727204215.56201: running the handler 41016 1727204215.56220: _low_level_execute_command(): starting 41016 1727204215.56223: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204215.57704: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204215.57761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204215.57765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204215.57896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204215.57981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204215.59905: stdout chunk (state=3): >>>/root 
<<< 41016 1727204215.60043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204215.60046: stderr chunk (state=3): >>><<< 41016 1727204215.60052: stdout chunk (state=3): >>><<< 41016 1727204215.60196: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204215.60215: _low_level_execute_command(): starting 41016 1727204215.60219: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204215.6019735-43556-154879507172289 `" && echo ansible-tmp-1727204215.6019735-43556-154879507172289="` echo /root/.ansible/tmp/ansible-tmp-1727204215.6019735-43556-154879507172289 `" ) && sleep 0' 41016 1727204215.61766: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204215.61770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204215.61772: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204215.61775: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204215.61789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 41016 1727204215.61792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204215.61795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204215.61929: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204215.62022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204215.64125: 
stdout chunk (state=3): >>>ansible-tmp-1727204215.6019735-43556-154879507172289=/root/.ansible/tmp/ansible-tmp-1727204215.6019735-43556-154879507172289 <<< 41016 1727204215.64356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204215.64360: stderr chunk (state=3): >>><<< 41016 1727204215.64362: stdout chunk (state=3): >>><<< 41016 1727204215.64391: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204215.6019735-43556-154879507172289=/root/.ansible/tmp/ansible-tmp-1727204215.6019735-43556-154879507172289 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204215.64423: variable 'ansible_module_compression' from source: unknown 41016 1727204215.64477: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204215.64515: variable 'ansible_facts' from source: unknown 41016 1727204215.64813: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204215.6019735-43556-154879507172289/AnsiballZ_command.py 41016 1727204215.65173: Sending initial data 41016 1727204215.65178: Sent initial data (156 bytes) 41016 1727204215.66323: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204215.66351: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204215.66545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204215.66555: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 41016 1727204215.66674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204215.68616: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204215.68664: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmp3sfd2bir /root/.ansible/tmp/ansible-tmp-1727204215.6019735-43556-154879507172289/AnsiballZ_command.py <<< 41016 1727204215.68668: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204215.6019735-43556-154879507172289/AnsiballZ_command.py" <<< 41016 1727204215.68734: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmp3sfd2bir" to remote "/root/.ansible/tmp/ansible-tmp-1727204215.6019735-43556-154879507172289/AnsiballZ_command.py" <<< 41016 1727204215.68740: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204215.6019735-43556-154879507172289/AnsiballZ_command.py" <<< 41016 1727204215.70080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204215.70089: stderr chunk (state=3): >>><<< 41016 1727204215.70092: stdout chunk (state=3): >>><<< 41016 1727204215.70188: done transferring module to remote 41016 1727204215.70199: _low_level_execute_command(): starting 41016 1727204215.70204: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204215.6019735-43556-154879507172289/ /root/.ansible/tmp/ansible-tmp-1727204215.6019735-43556-154879507172289/AnsiballZ_command.py && sleep 0' 41016 1727204215.71735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 41016 1727204215.71739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204215.71742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204215.71744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204215.71747: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204215.71916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204215.72012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204215.74185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204215.74192: stderr chunk (state=3): >>><<< 41016 1727204215.74195: stdout chunk (state=3): >>><<< 41016 1727204215.74215: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204215.74218: _low_level_execute_command(): starting 41016 1727204215.74221: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204215.6019735-43556-154879507172289/AnsiballZ_command.py && sleep 0' 41016 1727204215.75512: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204215.75532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204215.93342: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": 
["ip", "link", "del", "ethtest0"], "start": "2024-09-24 14:56:55.921523", "end": "2024-09-24 14:56:55.931568", "delta": "0:00:00.010045", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204215.95349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204215.95353: stderr chunk (state=3): >>><<< 41016 1727204215.95355: stdout chunk (state=3): >>><<< 41016 1727204215.95380: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 14:56:55.921523", "end": "2024-09-24 14:56:55.931568", "delta": "0:00:00.010045", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
41016 1727204215.95423: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204215.6019735-43556-154879507172289/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204215.95432: _low_level_execute_command(): starting 41016 1727204215.95443: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204215.6019735-43556-154879507172289/ > /dev/null 2>&1 && sleep 0' 41016 1727204215.96882: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204215.96885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204215.96930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204215.97582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204215.97841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204215.99891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204215.99895: stdout chunk (state=3): >>><<< 41016 1727204215.99901: stderr chunk (state=3): >>><<< 41016 1727204215.99926: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204215.99930: handler run complete 41016 1727204215.99956: Evaluated conditional (False): False 41016 1727204215.99965: attempt loop complete, returning result 41016 1727204215.99968: _execute() done 41016 1727204215.99971: dumping result to json 41016 1727204215.99977: done dumping result, returning 41016 1727204216.00139: done running TaskExecutor() for managed-node1/TASK: Remove test interface if necessary [028d2410-947f-12d5-0ec4-0000000008da] 41016 1727204216.00142: sending task result for task 028d2410-947f-12d5-0ec4-0000000008da 41016 1727204216.00254: done sending task result for task 028d2410-947f-12d5-0ec4-0000000008da 41016 1727204216.00259: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.010045", "end": "2024-09-24 14:56:55.931568", "rc": 0, "start": "2024-09-24 14:56:55.921523" } 41016 1727204216.00332: no more pending results, returning what we have 41016 1727204216.00337: results queue empty 41016 1727204216.00452: checking for any_errors_fatal 41016 1727204216.00455: done checking for any_errors_fatal 41016 1727204216.00455: checking for max_fail_percentage 41016 1727204216.00457: done checking for max_fail_percentage 41016 1727204216.00458: checking to see if all hosts have failed and the running result is not ok 41016 1727204216.00459: done checking to see if all hosts have failed 41016 1727204216.00460: getting the remaining hosts for this loop 41016 1727204216.00462: done getting the remaining hosts for this loop 41016 1727204216.00465: getting the next task for host managed-node1 41016 1727204216.00478: done getting next task for host managed-node1 41016 1727204216.00482: ^ task is: TASK: Assert interface0 is absent 41016 1727204216.00486: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204216.00491: getting variables 41016 1727204216.00493: in VariableManager get_vars() 41016 1727204216.00534: Calling all_inventory to load vars for managed-node1 41016 1727204216.00537: Calling groups_inventory to load vars for managed-node1 41016 1727204216.00539: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204216.00549: Calling all_plugins_play to load vars for managed-node1 41016 1727204216.00551: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204216.00553: Calling groups_plugins_play to load vars for managed-node1 41016 1727204216.04747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204216.09197: done with get_vars() 41016 1727204216.09230: done getting variables TASK [Assert interface0 is absent] ********************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:160 Tuesday 24 September 2024 14:56:56 -0400 (0:00:00.578) 0:00:39.770 ***** 41016 1727204216.09478: entering _queue_task() for managed-node1/include_tasks 41016 1727204216.10168: worker is 1 (out of 1 available) 41016 1727204216.10396: exiting _queue_task() for managed-node1/include_tasks 41016 1727204216.10412: done queuing things up, now waiting for results queue to drain 41016 1727204216.10413: waiting for pending results... 41016 1727204216.10831: running TaskExecutor() for managed-node1/TASK: Assert interface0 is absent 41016 1727204216.11037: in run() - task 028d2410-947f-12d5-0ec4-0000000000b9 41016 1727204216.11054: variable 'ansible_search_path' from source: unknown 41016 1727204216.11285: calling self._execute() 41016 1727204216.11549: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204216.11553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204216.11556: variable 'omit' from source: magic vars 41016 1727204216.12285: variable 'ansible_distribution_major_version' from source: facts 41016 1727204216.12295: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204216.12301: _execute() done 41016 1727204216.12305: dumping result to json 41016 1727204216.12310: done dumping result, returning 41016 1727204216.12320: done running TaskExecutor() for managed-node1/TASK: Assert interface0 is absent [028d2410-947f-12d5-0ec4-0000000000b9] 41016 1727204216.12323: sending task result for task 028d2410-947f-12d5-0ec4-0000000000b9 41016 1727204216.12416: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000b9 41016 1727204216.12420: WORKER PROCESS EXITING 41016 1727204216.12450: no more pending results, returning what we have 41016 1727204216.12455: in VariableManager get_vars() 41016 1727204216.12509: Calling all_inventory to load vars for managed-node1 41016 1727204216.12514: Calling groups_inventory to load vars for managed-node1 41016 1727204216.12517: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204216.12530: Calling all_plugins_play to load vars for managed-node1 41016 1727204216.12533: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204216.12536: Calling groups_plugins_play to load vars for managed-node1 41016 1727204216.14678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204216.16301: done with get_vars() 41016 1727204216.16325: variable 
'ansible_search_path' from source: unknown 41016 1727204216.16348: we have included files to process 41016 1727204216.16349: generating all_blocks data 41016 1727204216.16351: done generating all_blocks data 41016 1727204216.16355: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41016 1727204216.16356: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41016 1727204216.16359: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 41016 1727204216.16480: in VariableManager get_vars() 41016 1727204216.16504: done with get_vars() 41016 1727204216.16624: done processing included file 41016 1727204216.16626: iterating over new_blocks loaded from include file 41016 1727204216.16628: in VariableManager get_vars() 41016 1727204216.16646: done with get_vars() 41016 1727204216.16648: filtering new block on tags 41016 1727204216.16801: done filtering new block on tags 41016 1727204216.16804: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node1 41016 1727204216.16809: extending task lists for all hosts with included blocks 41016 1727204216.19165: done extending task lists 41016 1727204216.19167: done processing included files 41016 1727204216.19167: results queue empty 41016 1727204216.19168: checking for any_errors_fatal 41016 1727204216.19171: done checking for any_errors_fatal 41016 1727204216.19172: checking for max_fail_percentage 41016 1727204216.19173: done checking for max_fail_percentage 41016 1727204216.19173: checking to see if all hosts have failed and the running result is not ok 41016 1727204216.19174: done checking to see if all hosts have failed 41016 1727204216.19174: getting the remaining hosts for this loop 41016 1727204216.19177: done getting the remaining hosts for this loop 41016 1727204216.19179: getting the next task for host managed-node1 41016 1727204216.19182: done getting next task for host managed-node1 41016 1727204216.19183: ^ task is: TASK: Include the task 'get_interface_stat.yml' 41016 1727204216.19185: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204216.19187: getting variables 41016 1727204216.19188: in VariableManager get_vars() 41016 1727204216.19204: Calling all_inventory to load vars for managed-node1 41016 1727204216.19206: Calling groups_inventory to load vars for managed-node1 41016 1727204216.19208: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204216.19213: Calling all_plugins_play to load vars for managed-node1 41016 1727204216.19215: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204216.19216: Calling groups_plugins_play to load vars for managed-node1 41016 1727204216.19920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204216.21160: done with get_vars() 41016 1727204216.21189: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 14:56:56 -0400 (0:00:00.117) 0:00:39.888 ***** 41016 1727204216.21271: entering _queue_task() for managed-node1/include_tasks 41016 1727204216.22261: worker is 1 (out of 1 available) 41016 1727204216.22271: exiting _queue_task() for managed-node1/include_tasks 41016 1727204216.22323: done queuing things up, now waiting for results queue to drain 41016 1727204216.22325: waiting for pending results... 41016 1727204216.22706: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 41016 1727204216.22882: in run() - task 028d2410-947f-12d5-0ec4-000000000990 41016 1727204216.22887: variable 'ansible_search_path' from source: unknown 41016 1727204216.22889: variable 'ansible_search_path' from source: unknown 41016 1727204216.22931: calling self._execute() 41016 1727204216.23048: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204216.23172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204216.23178: variable 'omit' from source: magic vars 41016 1727204216.23513: variable 'ansible_distribution_major_version' from source: facts 41016 1727204216.23531: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204216.23541: _execute() done 41016 1727204216.23548: dumping result to json 41016 1727204216.23555: done dumping result, returning 41016 1727204216.23565: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-12d5-0ec4-000000000990] 41016 1727204216.23574: sending task result for task 028d2410-947f-12d5-0ec4-000000000990 41016 1727204216.23722: done sending task result for task 028d2410-947f-12d5-0ec4-000000000990 41016 1727204216.23725: WORKER PROCESS EXITING 41016 1727204216.23764: no more pending results, returning what we have 41016 1727204216.23770: in VariableManager get_vars() 41016 1727204216.23821: Calling all_inventory to load vars for managed-node1 41016 1727204216.23824: Calling groups_inventory to load vars for managed-node1 41016 1727204216.23827: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204216.23839: Calling all_plugins_play to load vars for managed-node1 41016 1727204216.23842: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204216.23845: Calling groups_plugins_play to load vars for managed-node1 41016 1727204216.25324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 41016 1727204216.27858: done with get_vars() 41016 1727204216.27938: variable 'ansible_search_path' from source: unknown 41016 1727204216.27940: variable 'ansible_search_path' from source: unknown 41016 1727204216.27981: we have included files to process 41016 1727204216.27982: generating all_blocks data 41016 1727204216.27984: done generating all_blocks data 41016 1727204216.27985: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41016 1727204216.27986: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41016 1727204216.27988: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 41016 1727204216.28203: done processing included file 41016 1727204216.28205: iterating over new_blocks loaded from include file 41016 1727204216.28206: in VariableManager get_vars() 41016 1727204216.28232: done with get_vars() 41016 1727204216.28234: filtering new block on tags 41016 1727204216.28261: done filtering new block on tags 41016 1727204216.28264: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 41016 1727204216.28270: extending task lists for all hosts with included blocks 41016 1727204216.28451: done extending task lists 41016 1727204216.28452: done processing included files 41016 1727204216.28453: results queue empty 41016 1727204216.28453: checking for any_errors_fatal 41016 1727204216.28457: done checking for any_errors_fatal 41016 1727204216.28458: checking for max_fail_percentage 41016 1727204216.28459: done checking for max_fail_percentage 41016 1727204216.28460: checking to see if all hosts have failed and the running result is not ok 41016 1727204216.28461: done checking to see if all hosts have failed 41016 1727204216.28461: getting the remaining hosts for this loop 41016 1727204216.28462: done getting the remaining hosts for this loop 41016 1727204216.28465: getting the next task for host managed-node1 41016 1727204216.28469: done getting next task for host managed-node1 41016 1727204216.28471: ^ task is: TASK: Get stat for interface {{ interface }} 41016 1727204216.28475: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 41016 1727204216.28479: getting variables 41016 1727204216.28480: in VariableManager get_vars() 41016 1727204216.28497: Calling all_inventory to load vars for managed-node1 41016 1727204216.28500: Calling groups_inventory to load vars for managed-node1 41016 1727204216.28502: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204216.28507: Calling all_plugins_play to load vars for managed-node1 41016 1727204216.28514: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204216.28518: Calling groups_plugins_play to load vars for managed-node1 41016 1727204216.29593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204216.30907: done with get_vars() 41016 1727204216.30931: done getting variables 41016 1727204216.31145: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:56:56 -0400 (0:00:00.099) 0:00:39.987 ***** 41016 1727204216.31179: entering _queue_task() for managed-node1/stat 41016 1727204216.31822: worker is 1 (out of 1 available) 41016 1727204216.31838: exiting _queue_task() for managed-node1/stat 41016 1727204216.31851: done queuing things up, now waiting for results queue to drain 41016 1727204216.31852: waiting for pending results... 41016 1727204216.32393: running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 41016 1727204216.32463: in run() - task 028d2410-947f-12d5-0ec4-000000000a4d 41016 1727204216.32479: variable 'ansible_search_path' from source: unknown 41016 1727204216.32486: variable 'ansible_search_path' from source: unknown 41016 1727204216.32571: calling self._execute() 41016 1727204216.32615: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204216.32619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204216.32627: variable 'omit' from source: magic vars 41016 1727204216.32994: variable 'ansible_distribution_major_version' from source: facts 41016 1727204216.33002: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204216.33009: variable 'omit' from source: magic vars 41016 1727204216.33058: variable 'omit' from source: magic vars 41016 1727204216.33184: variable 'interface' from source: set_fact 41016 1727204216.33188: variable 'omit' from source: magic vars 41016 1727204216.33213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204216.33248: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204216.33267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204216.33286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204216.33319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204216.33333: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204216.33336: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204216.33382: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204216.33462: Set connection var ansible_shell_executable to /bin/sh 41016 1727204216.33477: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204216.33485: Set connection var ansible_shell_type to sh 41016 1727204216.33490: Set connection var ansible_timeout to 10 41016 1727204216.33496: Set connection var ansible_pipelining to False 41016 1727204216.33508: Set connection var ansible_connection to ssh 41016 1727204216.33545: variable 'ansible_shell_executable' from source: unknown 41016 1727204216.33549: variable 'ansible_connection' from source: unknown 41016 1727204216.33552: variable 'ansible_module_compression' from source: unknown 41016 1727204216.33554: variable 'ansible_shell_type' from source: unknown 41016 1727204216.33557: variable 'ansible_shell_executable' from source: unknown 41016 1727204216.33559: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204216.33568: variable 'ansible_pipelining' from source: unknown 41016 1727204216.33571: variable 'ansible_timeout' from source: unknown 41016 1727204216.33573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204216.33738: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204216.33747: variable 'omit' from source: magic vars 41016 1727204216.33753: starting attempt loop 41016 1727204216.33756: running the handler 41016 1727204216.33767: _low_level_execute_command(): starting 41016 1727204216.33774: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204216.34279: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204216.34312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204216.34316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204216.34319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204216.34362: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204216.34365: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204216.34473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204216.36270: stdout chunk (state=3): >>>/root <<< 41016 1727204216.36418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204216.36422: stdout chunk 
(state=3): >>><<< 41016 1727204216.36424: stderr chunk (state=3): >>><<< 41016 1727204216.36545: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204216.36549: _low_level_execute_command(): starting 41016 1727204216.36552: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204216.3645186-43594-197509623080687 `" && echo ansible-tmp-1727204216.3645186-43594-197509623080687="` echo /root/.ansible/tmp/ansible-tmp-1727204216.3645186-43594-197509623080687 `" ) && sleep 0' 41016 1727204216.37048: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204216.37062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 41016 1727204216.37097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204216.37142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204216.37147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204216.37150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204216.37229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204216.39362: stdout chunk (state=3): >>>ansible-tmp-1727204216.3645186-43594-197509623080687=/root/.ansible/tmp/ansible-tmp-1727204216.3645186-43594-197509623080687 <<< 41016 1727204216.39474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
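
Every _low_level_execute_command() in this task rides the same persistent OpenSSH control socket, visible in the repeated "auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566'" and "mux_client_request_session" lines. A minimal sketch of connection variables that would produce this behaviour, assuming Ansible's stock OpenSSH options; the group_vars file name is hypothetical and no such file is shown in this run:

# group_vars/all.yml (hypothetical) -- mirrors the defaults implied by the trace:
# ControlMaster multiplexing for every /bin/sh invocation, pipelining disabled
# ("Set connection var ansible_pipelining to False" above).
ansible_connection: ssh
ansible_ssh_common_args: >-
  -o ControlMaster=auto
  -o ControlPersist=60s
ansible_pipelining: false
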
41016 1727204216.39505: stderr chunk (state=3): >>><<< 41016 1727204216.39509: stdout chunk (state=3): >>><<< 41016 1727204216.39528: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204216.3645186-43594-197509623080687=/root/.ansible/tmp/ansible-tmp-1727204216.3645186-43594-197509623080687 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204216.39572: variable 'ansible_module_compression' from source: unknown 41016 1727204216.39618: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 41016 1727204216.39652: variable 'ansible_facts' from source: unknown 41016 1727204216.39718: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204216.3645186-43594-197509623080687/AnsiballZ_stat.py 41016 1727204216.39824: Sending initial data 41016 1727204216.39827: Sent initial data (153 bytes) 41016 1727204216.40272: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204216.40322: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204216.40325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204216.40327: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 41016 1727204216.40333: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204216.40335: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204216.40420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204216.40425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 
1727204216.40535: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204216.42279: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204216.42352: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204216.42439: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmponpgi243 /root/.ansible/tmp/ansible-tmp-1727204216.3645186-43594-197509623080687/AnsiballZ_stat.py <<< 41016 1727204216.42442: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204216.3645186-43594-197509623080687/AnsiballZ_stat.py" <<< 41016 1727204216.42508: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmponpgi243" to remote "/root/.ansible/tmp/ansible-tmp-1727204216.3645186-43594-197509623080687/AnsiballZ_stat.py" <<< 41016 1727204216.42514: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204216.3645186-43594-197509623080687/AnsiballZ_stat.py" <<< 41016 1727204216.43394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204216.43437: stderr chunk (state=3): >>><<< 41016 1727204216.43440: stdout chunk (state=3): >>><<< 41016 1727204216.43484: done transferring module to remote 41016 1727204216.43501: _low_level_execute_command(): starting 41016 1727204216.43505: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204216.3645186-43594-197509623080687/ /root/.ansible/tmp/ansible-tmp-1727204216.3645186-43594-197509623080687/AnsiballZ_stat.py && sleep 0' 41016 1727204216.44144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204216.44151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204216.44153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 41016 1727204216.44155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204216.44163: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204216.44247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204216.44253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204216.44347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204216.46335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204216.46363: stderr chunk (state=3): >>><<< 41016 1727204216.46367: stdout chunk (state=3): >>><<< 41016 1727204216.46389: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204216.46392: _low_level_execute_command(): starting 41016 1727204216.46396: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204216.3645186-43594-197509623080687/AnsiballZ_stat.py && sleep 0' 41016 1727204216.47098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204216.47101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204216.47105: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204216.47107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 41016 1727204216.47109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204216.47190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 
1727204216.47214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204216.47332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204216.64018: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 41016 1727204216.65638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204216.65643: stdout chunk (state=3): >>><<< 41016 1727204216.65645: stderr chunk (state=3): >>><<< 41016 1727204216.65807: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
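
The invocation block in the stdout above pins down exactly what the test asks of the stat module. A minimal reconstruction of the task at get_interface_stat.yml:3, assuming the result is registered as interface_stat (the variable the following assert reads); inferred from the trace, not copied from the collection:

# Hypothetical reconstruction of tasks/get_interface_stat.yml, line 3, from the
# module_args shown above ('follow' and 'checksum_algorithm' are stat defaults).
- name: Get stat for interface {{ interface }}
  stat:
    get_attributes: false
    get_checksum: false
    get_mime: false
    path: /sys/class/net/{{ interface }}
  register: interface_stat
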
41016 1727204216.65811: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204216.3645186-43594-197509623080687/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204216.65814: _low_level_execute_command(): starting 41016 1727204216.65816: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204216.3645186-43594-197509623080687/ > /dev/null 2>&1 && sleep 0' 41016 1727204216.67492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204216.67531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204216.67555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204216.67583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204216.67714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204216.69893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204216.69897: stdout chunk (state=3): >>><<< 41016 1727204216.69899: stderr chunk (state=3): >>><<< 41016 1727204216.69901: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204216.69904: handler run complete 41016 1727204216.69907: attempt loop complete, returning result 41016 1727204216.69909: _execute() done 41016 1727204216.69910: dumping result to json 41016 1727204216.69912: done dumping result, returning 41016 1727204216.69914: done running TaskExecutor() for managed-node1/TASK: Get stat for interface ethtest0 [028d2410-947f-12d5-0ec4-000000000a4d] 41016 1727204216.69916: sending task result for task 028d2410-947f-12d5-0ec4-000000000a4d 41016 1727204216.70118: done sending task result for task 028d2410-947f-12d5-0ec4-000000000a4d 41016 1727204216.70122: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 41016 1727204216.70317: no more pending results, returning what we have 41016 1727204216.70322: results queue empty 41016 1727204216.70323: checking for any_errors_fatal 41016 1727204216.70325: done checking for any_errors_fatal 41016 1727204216.70325: checking for max_fail_percentage 41016 1727204216.70327: done checking for max_fail_percentage 41016 1727204216.70328: checking to see if all hosts have failed and the running result is not ok 41016 1727204216.70329: done checking to see if all hosts have failed 41016 1727204216.70330: getting the remaining hosts for this loop 41016 1727204216.70332: done getting the remaining hosts for this loop 41016 1727204216.70336: getting the next task for host managed-node1 41016 1727204216.70489: done getting next task for host managed-node1 41016 1727204216.70493: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 41016 1727204216.70497: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204216.70502: getting variables 41016 1727204216.70503: in VariableManager get_vars() 41016 1727204216.70547: Calling all_inventory to load vars for managed-node1 41016 1727204216.70550: Calling groups_inventory to load vars for managed-node1 41016 1727204216.70553: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204216.70564: Calling all_plugins_play to load vars for managed-node1 41016 1727204216.70567: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204216.70571: Calling groups_plugins_play to load vars for managed-node1 41016 1727204216.72976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204216.75563: done with get_vars() 41016 1727204216.75607: done getting variables 41016 1727204216.75668: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204216.75834: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 14:56:56 -0400 (0:00:00.446) 0:00:40.434 ***** 41016 1727204216.75866: entering _queue_task() for managed-node1/assert 41016 1727204216.76501: worker is 1 (out of 1 available) 41016 1727204216.76512: exiting _queue_task() for managed-node1/assert 41016 1727204216.76526: done queuing things up, now waiting for results queue to drain 41016 1727204216.76527: waiting for pending results... 
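
Both task paths in this block point at assert_device_absent.yml (lines 3 and 5). A sketch of what that file plausibly contains, reconstructed only from the task names and the conditional evaluated a few entries further down ('not interface_stat.stat.exists'); hypothetical, not quoted from the collection:

# Hypothetical reconstruction of tasks/assert_device_absent.yml (task paths :3 and :5).
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: Assert that the interface is absent - '{{ interface }}'
  assert:
    that:
      - not interface_stat.stat.exists
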
41016 1727204216.77100: running TaskExecutor() for managed-node1/TASK: Assert that the interface is absent - 'ethtest0' 41016 1727204216.77159: in run() - task 028d2410-947f-12d5-0ec4-000000000991 41016 1727204216.77177: variable 'ansible_search_path' from source: unknown 41016 1727204216.77180: variable 'ansible_search_path' from source: unknown 41016 1727204216.77184: calling self._execute() 41016 1727204216.77239: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204216.77243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204216.77252: variable 'omit' from source: magic vars 41016 1727204216.77544: variable 'ansible_distribution_major_version' from source: facts 41016 1727204216.77569: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204216.77572: variable 'omit' from source: magic vars 41016 1727204216.77600: variable 'omit' from source: magic vars 41016 1727204216.77677: variable 'interface' from source: set_fact 41016 1727204216.77691: variable 'omit' from source: magic vars 41016 1727204216.77737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204216.77784: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204216.77788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204216.77806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204216.77818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204216.77866: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204216.77870: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204216.77872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204216.77958: Set connection var ansible_shell_executable to /bin/sh 41016 1727204216.77961: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204216.77965: Set connection var ansible_shell_type to sh 41016 1727204216.77977: Set connection var ansible_timeout to 10 41016 1727204216.78007: Set connection var ansible_pipelining to False 41016 1727204216.78010: Set connection var ansible_connection to ssh 41016 1727204216.78031: variable 'ansible_shell_executable' from source: unknown 41016 1727204216.78034: variable 'ansible_connection' from source: unknown 41016 1727204216.78037: variable 'ansible_module_compression' from source: unknown 41016 1727204216.78039: variable 'ansible_shell_type' from source: unknown 41016 1727204216.78042: variable 'ansible_shell_executable' from source: unknown 41016 1727204216.78044: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204216.78046: variable 'ansible_pipelining' from source: unknown 41016 1727204216.78049: variable 'ansible_timeout' from source: unknown 41016 1727204216.78051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204216.78163: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 41016 1727204216.78174: variable 'omit' from source: magic vars 41016 1727204216.78179: starting attempt loop 41016 1727204216.78181: running the handler 41016 1727204216.78312: variable 'interface_stat' from source: set_fact 41016 1727204216.78322: Evaluated conditional (not interface_stat.stat.exists): True 41016 1727204216.78327: handler run complete 41016 1727204216.78338: attempt loop complete, returning result 41016 1727204216.78340: _execute() done 41016 1727204216.78343: dumping result to json 41016 1727204216.78347: done dumping result, returning 41016 1727204216.78357: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is absent - 'ethtest0' [028d2410-947f-12d5-0ec4-000000000991] 41016 1727204216.78360: sending task result for task 028d2410-947f-12d5-0ec4-000000000991 41016 1727204216.78440: done sending task result for task 028d2410-947f-12d5-0ec4-000000000991 41016 1727204216.78443: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 41016 1727204216.78499: no more pending results, returning what we have 41016 1727204216.78503: results queue empty 41016 1727204216.78504: checking for any_errors_fatal 41016 1727204216.78520: done checking for any_errors_fatal 41016 1727204216.78521: checking for max_fail_percentage 41016 1727204216.78523: done checking for max_fail_percentage 41016 1727204216.78523: checking to see if all hosts have failed and the running result is not ok 41016 1727204216.78524: done checking to see if all hosts have failed 41016 1727204216.78525: getting the remaining hosts for this loop 41016 1727204216.78526: done getting the remaining hosts for this loop 41016 1727204216.78530: getting the next task for host managed-node1 41016 1727204216.78538: done getting next task for host managed-node1 41016 1727204216.78541: ^ task is: TASK: Assert interface0 profile and interface1 profile are absent 41016 1727204216.78544: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204216.78549: getting variables 41016 1727204216.78551: in VariableManager get_vars() 41016 1727204216.78596: Calling all_inventory to load vars for managed-node1 41016 1727204216.78599: Calling groups_inventory to load vars for managed-node1 41016 1727204216.78602: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204216.78611: Calling all_plugins_play to load vars for managed-node1 41016 1727204216.78614: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204216.78617: Calling groups_plugins_play to load vars for managed-node1 41016 1727204216.79745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204216.81087: done with get_vars() 41016 1727204216.81119: done getting variables TASK [Assert interface0 profile and interface1 profile are absent] ************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:162 Tuesday 24 September 2024 14:56:56 -0400 (0:00:00.053) 0:00:40.488 ***** 41016 1727204216.81255: entering _queue_task() for managed-node1/include_tasks 41016 1727204216.81641: worker is 1 (out of 1 available) 41016 1727204216.81653: exiting _queue_task() for managed-node1/include_tasks 41016 1727204216.81665: done queuing things up, now waiting for results queue to drain 41016 1727204216.81666: waiting for pending results... 41016 1727204216.82226: running TaskExecutor() for managed-node1/TASK: Assert interface0 profile and interface1 profile are absent 41016 1727204216.82289: in run() - task 028d2410-947f-12d5-0ec4-0000000000ba 41016 1727204216.82294: variable 'ansible_search_path' from source: unknown 41016 1727204216.82297: variable 'interface0' from source: play vars 41016 1727204216.82586: variable 'interface0' from source: play vars 41016 1727204216.82590: variable 'interface1' from source: play vars 41016 1727204216.82651: variable 'interface1' from source: play vars 41016 1727204216.82655: variable 'omit' from source: magic vars 41016 1727204216.82765: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204216.82771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204216.82782: variable 'omit' from source: magic vars 41016 1727204216.82977: variable 'ansible_distribution_major_version' from source: facts 41016 1727204216.82980: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204216.83003: variable 'item' from source: unknown 41016 1727204216.83101: variable 'item' from source: unknown 41016 1727204216.83243: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204216.83246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204216.83249: variable 'omit' from source: magic vars 41016 1727204216.83341: variable 'ansible_distribution_major_version' from source: facts 41016 1727204216.83344: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204216.83366: variable 'item' from source: unknown 41016 1727204216.83412: variable 'item' from source: unknown 41016 1727204216.83486: dumping result to json 41016 1727204216.83489: done dumping result, returning 41016 1727204216.83491: done running TaskExecutor() for managed-node1/TASK: Assert interface0 profile and interface1 profile are absent [028d2410-947f-12d5-0ec4-0000000000ba] 41016 1727204216.83493: sending task result for task 
028d2410-947f-12d5-0ec4-0000000000ba 41016 1727204216.83528: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000ba 41016 1727204216.83531: WORKER PROCESS EXITING 41016 1727204216.83610: no more pending results, returning what we have 41016 1727204216.83617: in VariableManager get_vars() 41016 1727204216.83663: Calling all_inventory to load vars for managed-node1 41016 1727204216.83665: Calling groups_inventory to load vars for managed-node1 41016 1727204216.83668: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204216.83680: Calling all_plugins_play to load vars for managed-node1 41016 1727204216.83683: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204216.83685: Calling groups_plugins_play to load vars for managed-node1 41016 1727204216.85820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204216.86848: done with get_vars() 41016 1727204216.86868: variable 'ansible_search_path' from source: unknown 41016 1727204216.86883: variable 'ansible_search_path' from source: unknown 41016 1727204216.86889: we have included files to process 41016 1727204216.86889: generating all_blocks data 41016 1727204216.86891: done generating all_blocks data 41016 1727204216.86893: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 41016 1727204216.86894: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 41016 1727204216.86896: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 41016 1727204216.87014: in VariableManager get_vars() 41016 1727204216.87032: done with get_vars() 41016 1727204216.87115: done processing included file 41016 1727204216.87116: iterating over new_blocks loaded from include file 41016 1727204216.87117: in VariableManager get_vars() 41016 1727204216.87130: done with get_vars() 41016 1727204216.87131: filtering new block on tags 41016 1727204216.87152: done filtering new block on tags 41016 1727204216.87153: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node1 => (item=ethtest0) 41016 1727204216.87157: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 41016 1727204216.87158: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 41016 1727204216.87160: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 41016 1727204216.87214: in VariableManager get_vars() 41016 1727204216.87230: done with get_vars() 41016 1727204216.87287: done processing included file 41016 1727204216.87289: iterating over new_blocks loaded from include file 41016 1727204216.87290: in VariableManager get_vars() 41016 1727204216.87301: done with get_vars() 41016 1727204216.87302: filtering new block on tags 41016 1727204216.87322: done filtering new block on tags 41016 1727204216.87324: done iterating over new_blocks loaded from include file included: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node1 => (item=ethtest1) 41016 1727204216.87326: extending task lists for all hosts with included blocks 41016 1727204216.88313: done extending task lists 41016 1727204216.88314: done processing included files 41016 1727204216.88315: results queue empty 41016 1727204216.88316: checking for any_errors_fatal 41016 1727204216.88322: done checking for any_errors_fatal 41016 1727204216.88322: checking for max_fail_percentage 41016 1727204216.88324: done checking for max_fail_percentage 41016 1727204216.88324: checking to see if all hosts have failed and the running result is not ok 41016 1727204216.88325: done checking to see if all hosts have failed 41016 1727204216.88326: getting the remaining hosts for this loop 41016 1727204216.88327: done getting the remaining hosts for this loop 41016 1727204216.88330: getting the next task for host managed-node1 41016 1727204216.88334: done getting next task for host managed-node1 41016 1727204216.88336: ^ task is: TASK: Include the task 'get_profile_stat.yml' 41016 1727204216.88338: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204216.88341: getting variables 41016 1727204216.88341: in VariableManager get_vars() 41016 1727204216.88354: Calling all_inventory to load vars for managed-node1 41016 1727204216.88356: Calling groups_inventory to load vars for managed-node1 41016 1727204216.88364: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204216.88370: Calling all_plugins_play to load vars for managed-node1 41016 1727204216.88372: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204216.88376: Calling groups_plugins_play to load vars for managed-node1 41016 1727204216.94125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204216.94995: done with get_vars() 41016 1727204216.95015: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 14:56:56 -0400 (0:00:00.138) 0:00:40.626 ***** 41016 1727204216.95071: entering _queue_task() for managed-node1/include_tasks 41016 1727204216.95349: worker is 1 (out of 1 available) 41016 1727204216.95362: exiting _queue_task() for managed-node1/include_tasks 41016 1727204216.95377: done queuing things up, now waiting for results queue to drain 41016 1727204216.95379: waiting for pending results... 
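
The include that just finished expanding ran assert_profile_absent.yml once per item (ethtest0, ethtest1), and the 'profile'/'item' variables are later reported as include params. A sketch of the driving task at tests_route_device.yml:162 consistent with that; the vars: mapping is inferred from the trace, not confirmed from the playbook source:

# Hypothetical reconstruction of the task at tests_route_device.yml:162, based on the
# loop items and the 'profile'/'item' include params reported further down the trace.
- name: Assert interface0 profile and interface1 profile are absent
  include_tasks: tasks/assert_profile_absent.yml
  vars:
    profile: "{{ item }}"
  loop:
    - "{{ interface0 }}"
    - "{{ interface1 }}"
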
41016 1727204216.95559: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 41016 1727204216.95644: in run() - task 028d2410-947f-12d5-0ec4-000000000a6c 41016 1727204216.95656: variable 'ansible_search_path' from source: unknown 41016 1727204216.95660: variable 'ansible_search_path' from source: unknown 41016 1727204216.95693: calling self._execute() 41016 1727204216.95770: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204216.95774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204216.95786: variable 'omit' from source: magic vars 41016 1727204216.96083: variable 'ansible_distribution_major_version' from source: facts 41016 1727204216.96092: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204216.96099: _execute() done 41016 1727204216.96101: dumping result to json 41016 1727204216.96107: done dumping result, returning 41016 1727204216.96115: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [028d2410-947f-12d5-0ec4-000000000a6c] 41016 1727204216.96118: sending task result for task 028d2410-947f-12d5-0ec4-000000000a6c 41016 1727204216.96204: done sending task result for task 028d2410-947f-12d5-0ec4-000000000a6c 41016 1727204216.96206: WORKER PROCESS EXITING 41016 1727204216.96241: no more pending results, returning what we have 41016 1727204216.96247: in VariableManager get_vars() 41016 1727204216.96295: Calling all_inventory to load vars for managed-node1 41016 1727204216.96298: Calling groups_inventory to load vars for managed-node1 41016 1727204216.96300: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204216.96315: Calling all_plugins_play to load vars for managed-node1 41016 1727204216.96320: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204216.96323: Calling groups_plugins_play to load vars for managed-node1 41016 1727204216.97534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204216.98606: done with get_vars() 41016 1727204216.98621: variable 'ansible_search_path' from source: unknown 41016 1727204216.98622: variable 'ansible_search_path' from source: unknown 41016 1727204216.98649: we have included files to process 41016 1727204216.98649: generating all_blocks data 41016 1727204216.98651: done generating all_blocks data 41016 1727204216.98652: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 41016 1727204216.98652: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 41016 1727204216.98654: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 41016 1727204216.99355: done processing included file 41016 1727204216.99356: iterating over new_blocks loaded from include file 41016 1727204216.99358: in VariableManager get_vars() 41016 1727204216.99376: done with get_vars() 41016 1727204216.99378: filtering new block on tags 41016 1727204216.99450: done filtering new block on tags 41016 1727204216.99453: in VariableManager get_vars() 41016 1727204216.99464: done with get_vars() 41016 1727204216.99465: filtering new block on tags 41016 1727204216.99501: done filtering new block on tags 41016 1727204216.99504: done iterating over 
new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 41016 1727204216.99508: extending task lists for all hosts with included blocks 41016 1727204216.99584: done extending task lists 41016 1727204216.99586: done processing included files 41016 1727204216.99587: results queue empty 41016 1727204216.99587: checking for any_errors_fatal 41016 1727204216.99590: done checking for any_errors_fatal 41016 1727204216.99590: checking for max_fail_percentage 41016 1727204216.99591: done checking for max_fail_percentage 41016 1727204216.99592: checking to see if all hosts have failed and the running result is not ok 41016 1727204216.99592: done checking to see if all hosts have failed 41016 1727204216.99593: getting the remaining hosts for this loop 41016 1727204216.99594: done getting the remaining hosts for this loop 41016 1727204216.99595: getting the next task for host managed-node1 41016 1727204216.99598: done getting next task for host managed-node1 41016 1727204216.99600: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 41016 1727204216.99602: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204216.99603: getting variables 41016 1727204216.99604: in VariableManager get_vars() 41016 1727204216.99613: Calling all_inventory to load vars for managed-node1 41016 1727204216.99615: Calling groups_inventory to load vars for managed-node1 41016 1727204216.99616: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204216.99620: Calling all_plugins_play to load vars for managed-node1 41016 1727204216.99622: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204216.99623: Calling groups_plugins_play to load vars for managed-node1 41016 1727204217.00250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204217.01510: done with get_vars() 41016 1727204217.01538: done getting variables 41016 1727204217.01586: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:56:57 -0400 (0:00:00.065) 0:00:40.692 ***** 41016 1727204217.01624: entering _queue_task() for managed-node1/set_fact 41016 1727204217.02006: worker is 1 (out of 1 available) 41016 1727204217.02017: exiting _queue_task() for managed-node1/set_fact 41016 1727204217.02030: done queuing things up, now waiting for results queue to drain 41016 1727204217.02032: waiting for pending results... 
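
The set_fact about to run initializes the three lsr_net_profile_* flags reported in its result below. A reconstruction of the opening task of get_profile_stat.yml (line 3), taking the fact names and values straight from that result; the task body is inferred, not quoted from the collection:

# Hypothetical reconstruction of tasks/get_profile_stat.yml, line 3; fact names and
# values are copied from the set_fact result shown below.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
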
41016 1727204217.02398: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 41016 1727204217.02582: in run() - task 028d2410-947f-12d5-0ec4-000000000b3c 41016 1727204217.02586: variable 'ansible_search_path' from source: unknown 41016 1727204217.02590: variable 'ansible_search_path' from source: unknown 41016 1727204217.02593: calling self._execute() 41016 1727204217.02634: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204217.02647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204217.02663: variable 'omit' from source: magic vars 41016 1727204217.03097: variable 'ansible_distribution_major_version' from source: facts 41016 1727204217.03151: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204217.03155: variable 'omit' from source: magic vars 41016 1727204217.03194: variable 'omit' from source: magic vars 41016 1727204217.03236: variable 'omit' from source: magic vars 41016 1727204217.03295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204217.03370: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204217.03375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204217.03400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204217.03420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204217.03480: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204217.03483: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204217.03486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204217.03681: Set connection var ansible_shell_executable to /bin/sh 41016 1727204217.03686: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204217.03689: Set connection var ansible_shell_type to sh 41016 1727204217.03692: Set connection var ansible_timeout to 10 41016 1727204217.03694: Set connection var ansible_pipelining to False 41016 1727204217.03697: Set connection var ansible_connection to ssh 41016 1727204217.03699: variable 'ansible_shell_executable' from source: unknown 41016 1727204217.03701: variable 'ansible_connection' from source: unknown 41016 1727204217.03704: variable 'ansible_module_compression' from source: unknown 41016 1727204217.03706: variable 'ansible_shell_type' from source: unknown 41016 1727204217.03708: variable 'ansible_shell_executable' from source: unknown 41016 1727204217.03713: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204217.03715: variable 'ansible_pipelining' from source: unknown 41016 1727204217.03716: variable 'ansible_timeout' from source: unknown 41016 1727204217.03718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204217.03949: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204217.03953: variable 
'omit' from source: magic vars 41016 1727204217.03956: starting attempt loop 41016 1727204217.03958: running the handler 41016 1727204217.03959: handler run complete 41016 1727204217.03961: attempt loop complete, returning result 41016 1727204217.03963: _execute() done 41016 1727204217.03965: dumping result to json 41016 1727204217.03967: done dumping result, returning 41016 1727204217.03969: done running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [028d2410-947f-12d5-0ec4-000000000b3c] 41016 1727204217.03971: sending task result for task 028d2410-947f-12d5-0ec4-000000000b3c ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 41016 1727204217.04343: no more pending results, returning what we have 41016 1727204217.04346: results queue empty 41016 1727204217.04347: checking for any_errors_fatal 41016 1727204217.04348: done checking for any_errors_fatal 41016 1727204217.04349: checking for max_fail_percentage 41016 1727204217.04351: done checking for max_fail_percentage 41016 1727204217.04352: checking to see if all hosts have failed and the running result is not ok 41016 1727204217.04353: done checking to see if all hosts have failed 41016 1727204217.04353: getting the remaining hosts for this loop 41016 1727204217.04355: done getting the remaining hosts for this loop 41016 1727204217.04358: getting the next task for host managed-node1 41016 1727204217.04365: done getting next task for host managed-node1 41016 1727204217.04367: ^ task is: TASK: Stat profile file 41016 1727204217.04372: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204217.04374: getting variables 41016 1727204217.04378: in VariableManager get_vars() 41016 1727204217.04416: Calling all_inventory to load vars for managed-node1 41016 1727204217.04419: Calling groups_inventory to load vars for managed-node1 41016 1727204217.04422: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204217.04432: Calling all_plugins_play to load vars for managed-node1 41016 1727204217.04435: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204217.04438: Calling groups_plugins_play to load vars for managed-node1 41016 1727204217.04955: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b3c 41016 1727204217.04960: WORKER PROCESS EXITING 41016 1727204217.06057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204217.07886: done with get_vars() 41016 1727204217.07914: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:56:57 -0400 (0:00:00.064) 0:00:40.756 ***** 41016 1727204217.08027: entering _queue_task() for managed-node1/stat 41016 1727204217.08582: worker is 1 (out of 1 available) 41016 1727204217.08594: exiting _queue_task() for managed-node1/stat 41016 1727204217.08604: done queuing things up, now waiting for results queue to drain 41016 1727204217.08606: waiting for pending results... 41016 1727204217.08799: running TaskExecutor() for managed-node1/TASK: Stat profile file 41016 1727204217.08920: in run() - task 028d2410-947f-12d5-0ec4-000000000b3d 41016 1727204217.08949: variable 'ansible_search_path' from source: unknown 41016 1727204217.08958: variable 'ansible_search_path' from source: unknown 41016 1727204217.09005: calling self._execute() 41016 1727204217.09122: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204217.09162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204217.09167: variable 'omit' from source: magic vars 41016 1727204217.09613: variable 'ansible_distribution_major_version' from source: facts 41016 1727204217.09635: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204217.09705: variable 'omit' from source: magic vars 41016 1727204217.09719: variable 'omit' from source: magic vars 41016 1727204217.09832: variable 'profile' from source: include params 41016 1727204217.09841: variable 'item' from source: include params 41016 1727204217.09930: variable 'item' from source: include params 41016 1727204217.09953: variable 'omit' from source: magic vars 41016 1727204217.10004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204217.10091: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204217.10094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204217.10101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204217.10118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204217.10163: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 41016 1727204217.10172: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204217.10181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204217.10289: Set connection var ansible_shell_executable to /bin/sh 41016 1727204217.10307: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204217.10354: Set connection var ansible_shell_type to sh 41016 1727204217.10357: Set connection var ansible_timeout to 10 41016 1727204217.10359: Set connection var ansible_pipelining to False 41016 1727204217.10361: Set connection var ansible_connection to ssh 41016 1727204217.10362: variable 'ansible_shell_executable' from source: unknown 41016 1727204217.10364: variable 'ansible_connection' from source: unknown 41016 1727204217.10367: variable 'ansible_module_compression' from source: unknown 41016 1727204217.10373: variable 'ansible_shell_type' from source: unknown 41016 1727204217.10380: variable 'ansible_shell_executable' from source: unknown 41016 1727204217.10385: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204217.10390: variable 'ansible_pipelining' from source: unknown 41016 1727204217.10395: variable 'ansible_timeout' from source: unknown 41016 1727204217.10401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204217.10618: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204217.10646: variable 'omit' from source: magic vars 41016 1727204217.10684: starting attempt loop 41016 1727204217.10687: running the handler 41016 1727204217.10689: _low_level_execute_command(): starting 41016 1727204217.10696: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204217.11501: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204217.11573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204217.11579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204217.11771: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204217.11831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204217.13657: stdout chunk (state=3): >>>/root <<< 41016 1727204217.13818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 
1727204217.13821: stdout chunk (state=3): >>><<< 41016 1727204217.13824: stderr chunk (state=3): >>><<< 41016 1727204217.13845: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204217.13869: _low_level_execute_command(): starting 41016 1727204217.13904: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204217.1385477-43627-7631365591294 `" && echo ansible-tmp-1727204217.1385477-43627-7631365591294="` echo /root/.ansible/tmp/ansible-tmp-1727204217.1385477-43627-7631365591294 `" ) && sleep 0' 41016 1727204217.14498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204217.14598: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204217.14628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204217.14644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204217.14668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204217.14785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204217.16914: stdout chunk (state=3): >>>ansible-tmp-1727204217.1385477-43627-7631365591294=/root/.ansible/tmp/ansible-tmp-1727204217.1385477-43627-7631365591294 <<< 41016 1727204217.17017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204217.17044: stderr 
chunk (state=3): >>><<< 41016 1727204217.17047: stdout chunk (state=3): >>><<< 41016 1727204217.17063: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204217.1385477-43627-7631365591294=/root/.ansible/tmp/ansible-tmp-1727204217.1385477-43627-7631365591294 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204217.17108: variable 'ansible_module_compression' from source: unknown 41016 1727204217.17154: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 41016 1727204217.17197: variable 'ansible_facts' from source: unknown 41016 1727204217.17258: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204217.1385477-43627-7631365591294/AnsiballZ_stat.py 41016 1727204217.17356: Sending initial data 41016 1727204217.17359: Sent initial data (151 bytes) 41016 1727204217.17755: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204217.17759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204217.17789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204217.17792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204217.17795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204217.17849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204217.17855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204217.17857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204217.17935: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204217.19928: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204217.20005: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204217.20121: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpvwo1ltoj /root/.ansible/tmp/ansible-tmp-1727204217.1385477-43627-7631365591294/AnsiballZ_stat.py <<< 41016 1727204217.20125: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204217.1385477-43627-7631365591294/AnsiballZ_stat.py" <<< 41016 1727204217.20188: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpvwo1ltoj" to remote "/root/.ansible/tmp/ansible-tmp-1727204217.1385477-43627-7631365591294/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204217.1385477-43627-7631365591294/AnsiballZ_stat.py" <<< 41016 1727204217.21644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204217.21648: stdout chunk (state=3): >>><<< 41016 1727204217.21650: stderr chunk (state=3): >>><<< 41016 1727204217.21652: done transferring module to remote 41016 1727204217.21653: _low_level_execute_command(): starting 41016 1727204217.21656: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204217.1385477-43627-7631365591294/ /root/.ansible/tmp/ansible-tmp-1727204217.1385477-43627-7631365591294/AnsiballZ_stat.py && sleep 0' 41016 1727204217.22213: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204217.22293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204217.22329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204217.22347: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204217.22366: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204217.22475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204217.24625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204217.24647: stdout chunk (state=3): >>><<< 41016 1727204217.24655: stderr chunk (state=3): >>><<< 41016 1727204217.24783: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204217.24788: _low_level_execute_command(): starting 41016 1727204217.24790: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204217.1385477-43627-7631365591294/AnsiballZ_stat.py && sleep 0' 41016 1727204217.25371: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204217.25433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204217.25498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204217.25525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204217.25555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204217.25670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204217.42326: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": 
{"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 41016 1727204217.43906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204217.43925: stderr chunk (state=3): >>><<< 41016 1727204217.43928: stdout chunk (state=3): >>><<< 41016 1727204217.43943: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
41016 1727204217.43968: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204217.1385477-43627-7631365591294/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204217.43979: _low_level_execute_command(): starting 41016 1727204217.43994: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204217.1385477-43627-7631365591294/ > /dev/null 2>&1 && sleep 0' 41016 1727204217.44507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204217.44536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204217.44539: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204217.44541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204217.44627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204217.44736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204217.46745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204217.46749: stdout chunk (state=3): >>><<< 41016 1727204217.46751: stderr chunk (state=3): >>><<< 41016 1727204217.46768: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204217.46776: handler run complete 41016 1727204217.46804: attempt loop complete, returning result 41016 1727204217.46807: _execute() done 41016 1727204217.46809: dumping result to json 41016 1727204217.46815: done dumping result, returning 41016 1727204217.46845: done running TaskExecutor() for managed-node1/TASK: Stat profile file [028d2410-947f-12d5-0ec4-000000000b3d] 41016 1727204217.46852: sending task result for task 028d2410-947f-12d5-0ec4-000000000b3d 41016 1727204217.46985: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b3d 41016 1727204217.46988: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 41016 1727204217.47072: no more pending results, returning what we have 41016 1727204217.47079: results queue empty 41016 1727204217.47080: checking for any_errors_fatal 41016 1727204217.47087: done checking for any_errors_fatal 41016 1727204217.47088: checking for max_fail_percentage 41016 1727204217.47090: done checking for max_fail_percentage 41016 1727204217.47091: checking to see if all hosts have failed and the running result is not ok 41016 1727204217.47091: done checking to see if all hosts have failed 41016 1727204217.47092: getting the remaining hosts for this loop 41016 1727204217.47094: done getting the remaining hosts for this loop 41016 1727204217.47097: getting the next task for host managed-node1 41016 1727204217.47108: done getting next task for host managed-node1 41016 1727204217.47111: ^ task is: TASK: Set NM profile exist flag based on the profile files 41016 1727204217.47119: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204217.47124: getting variables 41016 1727204217.47126: in VariableManager get_vars() 41016 1727204217.47238: Calling all_inventory to load vars for managed-node1 41016 1727204217.47276: Calling groups_inventory to load vars for managed-node1 41016 1727204217.47280: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204217.47315: Calling all_plugins_play to load vars for managed-node1 41016 1727204217.47342: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204217.47347: Calling groups_plugins_play to load vars for managed-node1 41016 1727204217.48643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204217.49986: done with get_vars() 41016 1727204217.50013: done getting variables 41016 1727204217.50072: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:56:57 -0400 (0:00:00.420) 0:00:41.177 ***** 41016 1727204217.50122: entering _queue_task() for managed-node1/set_fact 41016 1727204217.50450: worker is 1 (out of 1 available) 41016 1727204217.50463: exiting _queue_task() for managed-node1/set_fact 41016 1727204217.50692: done queuing things up, now waiting for results queue to drain 41016 1727204217.50694: waiting for pending results... 
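The task queued above (get_profile_stat.yml:17) is skipped in the next record because profile_stat.stat.exists evaluated to false. A plausible sketch, assuming it simply flips the flag initialized earlier; the exact fact it sets is not visible in this section of the log.

    - name: Set NM profile exist flag based on the profile files
      set_fact:
        lsr_net_profile_exists: true  # assumed; only the when condition is confirmed by the log
      when: profile_stat.stat.exists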
41016 1727204217.50895: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 41016 1727204217.50961: in run() - task 028d2410-947f-12d5-0ec4-000000000b3e 41016 1727204217.50971: variable 'ansible_search_path' from source: unknown 41016 1727204217.50978: variable 'ansible_search_path' from source: unknown 41016 1727204217.51009: calling self._execute() 41016 1727204217.51084: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204217.51088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204217.51098: variable 'omit' from source: magic vars 41016 1727204217.51405: variable 'ansible_distribution_major_version' from source: facts 41016 1727204217.51417: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204217.51507: variable 'profile_stat' from source: set_fact 41016 1727204217.51517: Evaluated conditional (profile_stat.stat.exists): False 41016 1727204217.51520: when evaluation is False, skipping this task 41016 1727204217.51524: _execute() done 41016 1727204217.51526: dumping result to json 41016 1727204217.51529: done dumping result, returning 41016 1727204217.51535: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [028d2410-947f-12d5-0ec4-000000000b3e] 41016 1727204217.51539: sending task result for task 028d2410-947f-12d5-0ec4-000000000b3e 41016 1727204217.51627: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b3e 41016 1727204217.51630: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41016 1727204217.51678: no more pending results, returning what we have 41016 1727204217.51682: results queue empty 41016 1727204217.51683: checking for any_errors_fatal 41016 1727204217.51693: done checking for any_errors_fatal 41016 1727204217.51694: checking for max_fail_percentage 41016 1727204217.51696: done checking for max_fail_percentage 41016 1727204217.51697: checking to see if all hosts have failed and the running result is not ok 41016 1727204217.51697: done checking to see if all hosts have failed 41016 1727204217.51698: getting the remaining hosts for this loop 41016 1727204217.51700: done getting the remaining hosts for this loop 41016 1727204217.51703: getting the next task for host managed-node1 41016 1727204217.51714: done getting next task for host managed-node1 41016 1727204217.51716: ^ task is: TASK: Get NM profile info 41016 1727204217.51722: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204217.51726: getting variables 41016 1727204217.51728: in VariableManager get_vars() 41016 1727204217.51767: Calling all_inventory to load vars for managed-node1 41016 1727204217.51770: Calling groups_inventory to load vars for managed-node1 41016 1727204217.51772: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204217.51783: Calling all_plugins_play to load vars for managed-node1 41016 1727204217.51785: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204217.51788: Calling groups_plugins_play to load vars for managed-node1 41016 1727204217.52707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204217.53772: done with get_vars() 41016 1727204217.53800: done getting variables 41016 1727204217.53901: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:56:57 -0400 (0:00:00.038) 0:00:41.215 ***** 41016 1727204217.53936: entering _queue_task() for managed-node1/shell 41016 1727204217.53938: Creating lock for shell 41016 1727204217.54267: worker is 1 (out of 1 available) 41016 1727204217.54382: exiting _queue_task() for managed-node1/shell 41016 1727204217.54394: done queuing things up, now waiting for results queue to drain 41016 1727204217.54395: waiting for pending results... 
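The shell task queued above (get_profile_stat.yml:25) runs the nmcli pipeline that appears verbatim in the module invocation further down ("nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc"). A minimal sketch, assuming the profile name is templated the same way as in the stat task and omitting any register or error handling that this section does not show:

    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc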
41016 1727204217.54648: running TaskExecutor() for managed-node1/TASK: Get NM profile info 41016 1727204217.54730: in run() - task 028d2410-947f-12d5-0ec4-000000000b3f 41016 1727204217.54733: variable 'ansible_search_path' from source: unknown 41016 1727204217.54736: variable 'ansible_search_path' from source: unknown 41016 1727204217.54744: calling self._execute() 41016 1727204217.54842: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204217.54880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204217.54886: variable 'omit' from source: magic vars 41016 1727204217.55272: variable 'ansible_distribution_major_version' from source: facts 41016 1727204217.55281: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204217.55285: variable 'omit' from source: magic vars 41016 1727204217.55319: variable 'omit' from source: magic vars 41016 1727204217.55430: variable 'profile' from source: include params 41016 1727204217.55433: variable 'item' from source: include params 41016 1727204217.55599: variable 'item' from source: include params 41016 1727204217.55604: variable 'omit' from source: magic vars 41016 1727204217.55607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204217.55637: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204217.55655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204217.55673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204217.55709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204217.55724: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204217.55728: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204217.55730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204217.55880: Set connection var ansible_shell_executable to /bin/sh 41016 1727204217.55883: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204217.55886: Set connection var ansible_shell_type to sh 41016 1727204217.55888: Set connection var ansible_timeout to 10 41016 1727204217.55890: Set connection var ansible_pipelining to False 41016 1727204217.55892: Set connection var ansible_connection to ssh 41016 1727204217.55912: variable 'ansible_shell_executable' from source: unknown 41016 1727204217.55915: variable 'ansible_connection' from source: unknown 41016 1727204217.55918: variable 'ansible_module_compression' from source: unknown 41016 1727204217.56034: variable 'ansible_shell_type' from source: unknown 41016 1727204217.56038: variable 'ansible_shell_executable' from source: unknown 41016 1727204217.56041: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204217.56043: variable 'ansible_pipelining' from source: unknown 41016 1727204217.56048: variable 'ansible_timeout' from source: unknown 41016 1727204217.56050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204217.56072: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204217.56145: variable 'omit' from source: magic vars 41016 1727204217.56148: starting attempt loop 41016 1727204217.56150: running the handler 41016 1727204217.56153: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204217.56157: _low_level_execute_command(): starting 41016 1727204217.56165: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204217.56967: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204217.56995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204217.57118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204217.59285: stdout chunk (state=3): >>>/root <<< 41016 1727204217.59289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204217.59291: stdout chunk (state=3): >>><<< 41016 1727204217.59293: stderr chunk (state=3): >>><<< 41016 1727204217.59296: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204217.59299: _low_level_execute_command(): starting 41016 1727204217.59301: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204217.5921347-43648-220991914182160 `" && echo ansible-tmp-1727204217.5921347-43648-220991914182160="` echo /root/.ansible/tmp/ansible-tmp-1727204217.5921347-43648-220991914182160 `" ) && sleep 0' 41016 1727204217.60638: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204217.60893: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204217.60917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204217.60961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204217.61040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204217.63143: stdout chunk (state=3): >>>ansible-tmp-1727204217.5921347-43648-220991914182160=/root/.ansible/tmp/ansible-tmp-1727204217.5921347-43648-220991914182160 <<< 41016 1727204217.63283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204217.63293: stdout chunk (state=3): >>><<< 41016 1727204217.63306: stderr chunk (state=3): >>><<< 41016 1727204217.63331: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204217.5921347-43648-220991914182160=/root/.ansible/tmp/ansible-tmp-1727204217.5921347-43648-220991914182160 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204217.63368: variable 'ansible_module_compression' from source: unknown 41016 1727204217.63432: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204217.63474: variable 'ansible_facts' from source: unknown 41016 1727204217.63566: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204217.5921347-43648-220991914182160/AnsiballZ_command.py 41016 1727204217.63698: Sending initial data 41016 1727204217.63803: Sent initial data (156 bytes) 41016 1727204217.64308: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204217.64323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204217.64396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204217.64443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204217.64460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204217.64478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204217.64585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204217.66373: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204217.66444: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 41016 1727204217.66544: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmp011t86yk /root/.ansible/tmp/ansible-tmp-1727204217.5921347-43648-220991914182160/AnsiballZ_command.py <<< 41016 1727204217.66547: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204217.5921347-43648-220991914182160/AnsiballZ_command.py" <<< 41016 1727204217.66626: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmp011t86yk" to remote "/root/.ansible/tmp/ansible-tmp-1727204217.5921347-43648-220991914182160/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204217.5921347-43648-220991914182160/AnsiballZ_command.py" <<< 41016 1727204217.67582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204217.67592: stdout chunk (state=3): >>><<< 41016 1727204217.67701: stderr chunk (state=3): >>><<< 41016 1727204217.67705: done transferring module to remote 41016 1727204217.67707: _low_level_execute_command(): starting 41016 1727204217.67709: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204217.5921347-43648-220991914182160/ /root/.ansible/tmp/ansible-tmp-1727204217.5921347-43648-220991914182160/AnsiballZ_command.py && sleep 0' 41016 1727204217.68306: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204217.68323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204217.68337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204217.68361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204217.68482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204217.68504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204217.68624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204217.70646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204217.70650: stdout chunk (state=3): >>><<< 41016 1727204217.70657: stderr chunk (state=3): >>><<< 41016 1727204217.70755: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204217.70759: _low_level_execute_command(): starting 41016 1727204217.70762: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204217.5921347-43648-220991914182160/AnsiballZ_command.py && sleep 0' 41016 1727204217.71320: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204217.71334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204217.71347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204217.71361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204217.71377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204217.71389: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204217.71494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204217.71640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204217.71765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204217.90015: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 14:56:57.880676", "end": "2024-09-24 14:56:57.897678", "delta": "0:00:00.017002", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204217.91809: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204217.91813: stdout chunk (state=3): >>><<< 41016 1727204217.91816: stderr chunk (state=3): >>><<< 41016 1727204217.91922: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 14:56:57.880676", "end": "2024-09-24 14:56:57.897678", "delta": "0:00:00.017002", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.14.47 closed. 
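The rc=1 above is the whole signal this check produces: the command module ran the nmcli pipeline on managed-node1 and grep found no line that both names ethtest0 and points at a file under /etc, so no NetworkManager profile for ethtest0 is stored there. A standalone Python sketch of the same check (command string taken from the log; run it on a host with nmcli to reproduce):

    import subprocess

    def nm_profile_stored_in_etc(profile: str) -> bool:
        # Same pipeline the task ran: list NAME,FILENAME pairs, keep lines that
        # mention the profile and a path under /etc. grep exits 1 when nothing
        # matches, which is the rc=1 recorded above.
        cmd = f"nmcli -f NAME,FILENAME connection show | grep {profile} | grep /etc"
        return subprocess.run(cmd, shell=True).returncode == 0

    if __name__ == "__main__":
        print("ethtest0 profile under /etc:", nm_profile_stored_in_etc("ethtest0"))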
41016 1727204217.91959: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204217.5921347-43648-220991914182160/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204217.91968: _low_level_execute_command(): starting 41016 1727204217.91974: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204217.5921347-43648-220991914182160/ > /dev/null 2>&1 && sleep 0' 41016 1727204217.93009: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204217.93012: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204217.93015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204217.93017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204217.93019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204217.93021: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204217.93023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204217.93025: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204217.93027: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204217.93034: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204217.93037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204217.93050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204217.93060: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204217.93235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204217.95246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204217.95250: stdout chunk (state=3): >>><<< 41016 1727204217.95260: stderr chunk (state=3): >>><<< 41016 1727204217.95425: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204217.95429: handler run complete 41016 1727204217.95431: Evaluated conditional (False): False 41016 1727204217.95440: attempt loop complete, returning result 41016 1727204217.95442: _execute() done 41016 1727204217.95445: dumping result to json 41016 1727204217.95446: done dumping result, returning 41016 1727204217.95481: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [028d2410-947f-12d5-0ec4-000000000b3f] 41016 1727204217.95484: sending task result for task 028d2410-947f-12d5-0ec4-000000000b3f 41016 1727204217.95732: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b3f 41016 1727204217.95735: WORKER PROCESS EXITING fatal: [managed-node1]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.017002", "end": "2024-09-24 14:56:57.897678", "rc": 1, "start": "2024-09-24 14:56:57.880676" } MSG: non-zero return code ...ignoring 41016 1727204217.95816: no more pending results, returning what we have 41016 1727204217.95820: results queue empty 41016 1727204217.95821: checking for any_errors_fatal 41016 1727204217.95829: done checking for any_errors_fatal 41016 1727204217.95830: checking for max_fail_percentage 41016 1727204217.95832: done checking for max_fail_percentage 41016 1727204217.95832: checking to see if all hosts have failed and the running result is not ok 41016 1727204217.95833: done checking to see if all hosts have failed 41016 1727204217.95833: getting the remaining hosts for this loop 41016 1727204217.95835: done getting the remaining hosts for this loop 41016 1727204217.95838: getting the next task for host managed-node1 41016 1727204217.95855: done getting next task for host managed-node1 41016 1727204217.95857: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 41016 1727204217.95862: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204217.95866: getting variables 41016 1727204217.95868: in VariableManager get_vars() 41016 1727204217.95913: Calling all_inventory to load vars for managed-node1 41016 1727204217.95916: Calling groups_inventory to load vars for managed-node1 41016 1727204217.95919: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204217.95929: Calling all_plugins_play to load vars for managed-node1 41016 1727204217.95932: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204217.95934: Calling groups_plugins_play to load vars for managed-node1 41016 1727204217.97832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204218.00599: done with get_vars() 41016 1727204218.00624: done getting variables 41016 1727204218.00813: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:56:58 -0400 (0:00:00.469) 0:00:41.684 ***** 41016 1727204218.00962: entering _queue_task() for managed-node1/set_fact 41016 1727204218.01799: worker is 1 (out of 1 available) 41016 1727204218.01822: exiting _queue_task() for managed-node1/set_fact 41016 1727204218.01834: done queuing things up, now waiting for results queue to drain 41016 1727204218.01835: waiting for pending results... 
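The entries above trace the usual AnsiballZ round trip for a single task: SFTP the zipped module into a per-task temp directory, chmod it, run it with the remote /usr/bin/python3.12, then rm the temp directory, all over the existing ControlMaster socket. A minimal sketch of that sequence with plain ssh (host and socket path taken from the log; the temp directory name is a hypothetical stand-in for the timestamped one Ansible generates):

    import subprocess

    HOST = "root@10.31.14.47"                     # target seen in the log
    CONTROL = "/root/.ansible/cp/a0f5415566"      # mux socket seen in the log
    REMOTE_TMP = "/root/.ansible/tmp/ansible-tmp-example/"   # hypothetical name

    def ssh(command: str) -> int:
        # Reuse the multiplexed master connection, like the auto-mux lines above.
        return subprocess.run(
            ["ssh", "-o", f"ControlPath={CONTROL}", HOST, command]
        ).returncode

    ssh(f"chmod u+x {REMOTE_TMP} {REMOTE_TMP}AnsiballZ_command.py && sleep 0")
    ssh(f"/usr/bin/python3.12 {REMOTE_TMP}AnsiballZ_command.py && sleep 0")
    ssh(f"rm -f -r {REMOTE_TMP} > /dev/null 2>&1 && sleep 0")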
41016 1727204218.02164: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 41016 1727204218.02218: in run() - task 028d2410-947f-12d5-0ec4-000000000b40 41016 1727204218.02239: variable 'ansible_search_path' from source: unknown 41016 1727204218.02256: variable 'ansible_search_path' from source: unknown 41016 1727204218.02300: calling self._execute() 41016 1727204218.02416: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.02428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.02441: variable 'omit' from source: magic vars 41016 1727204218.02862: variable 'ansible_distribution_major_version' from source: facts 41016 1727204218.02917: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204218.03053: variable 'nm_profile_exists' from source: set_fact 41016 1727204218.03073: Evaluated conditional (nm_profile_exists.rc == 0): False 41016 1727204218.03085: when evaluation is False, skipping this task 41016 1727204218.03133: _execute() done 41016 1727204218.03137: dumping result to json 41016 1727204218.03139: done dumping result, returning 41016 1727204218.03142: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [028d2410-947f-12d5-0ec4-000000000b40] 41016 1727204218.03144: sending task result for task 028d2410-947f-12d5-0ec4-000000000b40 skipping: [managed-node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 41016 1727204218.03405: no more pending results, returning what we have 41016 1727204218.03410: results queue empty 41016 1727204218.03411: checking for any_errors_fatal 41016 1727204218.03422: done checking for any_errors_fatal 41016 1727204218.03422: checking for max_fail_percentage 41016 1727204218.03424: done checking for max_fail_percentage 41016 1727204218.03426: checking to see if all hosts have failed and the running result is not ok 41016 1727204218.03427: done checking to see if all hosts have failed 41016 1727204218.03427: getting the remaining hosts for this loop 41016 1727204218.03429: done getting the remaining hosts for this loop 41016 1727204218.03433: getting the next task for host managed-node1 41016 1727204218.03444: done getting next task for host managed-node1 41016 1727204218.03457: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 41016 1727204218.03464: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204218.03480: getting variables 41016 1727204218.03483: in VariableManager get_vars() 41016 1727204218.03611: Calling all_inventory to load vars for managed-node1 41016 1727204218.03615: Calling groups_inventory to load vars for managed-node1 41016 1727204218.03618: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204218.03663: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b40 41016 1727204218.03698: WORKER PROCESS EXITING 41016 1727204218.03896: Calling all_plugins_play to load vars for managed-node1 41016 1727204218.03903: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204218.03908: Calling groups_plugins_play to load vars for managed-node1 41016 1727204218.06713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204218.08602: done with get_vars() 41016 1727204218.08627: done getting variables 41016 1727204218.08674: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204218.08769: variable 'profile' from source: include params 41016 1727204218.08773: variable 'item' from source: include params 41016 1727204218.08822: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:56:58 -0400 (0:00:00.079) 0:00:41.764 ***** 41016 1727204218.08846: entering _queue_task() for managed-node1/command 41016 1727204218.09111: worker is 1 (out of 1 available) 41016 1727204218.09124: exiting _queue_task() for managed-node1/command 41016 1727204218.09137: done queuing things up, now waiting for results queue to drain 41016 1727204218.09138: waiting for pending results... 
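The skip above follows directly from the registered result: the set_fact task is gated on the nmcli check returning 0, and it returned 1, so the flag task never runs and the earlier fact values stand. The decision reduces to the following (Python sketch; the nm_profile_exists values are copied from the result above, and the second flag name is a hypothetical guess at what the real task would set):

    # Stand-in for the registered result of the "Get NM profile info" task above.
    nm_profile_exists = {
        "rc": 1,
        "failed": True,
        "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc",
    }

    facts = {}
    if nm_profile_exists["rc"] == 0:     # the task's "when: nm_profile_exists.rc == 0"
        facts["lsr_net_profile_exists"] = True          # flag named later in this log
        facts["lsr_net_profile_ansible_managed"] = True  # hypothetical second flag
    else:
        print("Conditional result was False -> task skipped")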
41016 1727204218.09321: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0 41016 1727204218.09407: in run() - task 028d2410-947f-12d5-0ec4-000000000b42 41016 1727204218.09418: variable 'ansible_search_path' from source: unknown 41016 1727204218.09422: variable 'ansible_search_path' from source: unknown 41016 1727204218.09451: calling self._execute() 41016 1727204218.09534: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.09540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.09549: variable 'omit' from source: magic vars 41016 1727204218.09831: variable 'ansible_distribution_major_version' from source: facts 41016 1727204218.09839: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204218.09922: variable 'profile_stat' from source: set_fact 41016 1727204218.09933: Evaluated conditional (profile_stat.stat.exists): False 41016 1727204218.09936: when evaluation is False, skipping this task 41016 1727204218.09939: _execute() done 41016 1727204218.09941: dumping result to json 41016 1727204218.09944: done dumping result, returning 41016 1727204218.09950: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [028d2410-947f-12d5-0ec4-000000000b42] 41016 1727204218.09954: sending task result for task 028d2410-947f-12d5-0ec4-000000000b42 41016 1727204218.10048: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b42 41016 1727204218.10051: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41016 1727204218.10112: no more pending results, returning what we have 41016 1727204218.10118: results queue empty 41016 1727204218.10119: checking for any_errors_fatal 41016 1727204218.10126: done checking for any_errors_fatal 41016 1727204218.10127: checking for max_fail_percentage 41016 1727204218.10129: done checking for max_fail_percentage 41016 1727204218.10129: checking to see if all hosts have failed and the running result is not ok 41016 1727204218.10130: done checking to see if all hosts have failed 41016 1727204218.10131: getting the remaining hosts for this loop 41016 1727204218.10132: done getting the remaining hosts for this loop 41016 1727204218.10136: getting the next task for host managed-node1 41016 1727204218.10145: done getting next task for host managed-node1 41016 1727204218.10147: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 41016 1727204218.10152: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204218.10157: getting variables 41016 1727204218.10158: in VariableManager get_vars() 41016 1727204218.10383: Calling all_inventory to load vars for managed-node1 41016 1727204218.10386: Calling groups_inventory to load vars for managed-node1 41016 1727204218.10389: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204218.10399: Calling all_plugins_play to load vars for managed-node1 41016 1727204218.10402: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204218.10405: Calling groups_plugins_play to load vars for managed-node1 41016 1727204218.11896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204218.13222: done with get_vars() 41016 1727204218.13246: done getting variables 41016 1727204218.13306: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204218.13416: variable 'profile' from source: include params 41016 1727204218.13422: variable 'item' from source: include params 41016 1727204218.13462: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:56:58 -0400 (0:00:00.046) 0:00:41.811 ***** 41016 1727204218.13502: entering _queue_task() for managed-node1/set_fact 41016 1727204218.13741: worker is 1 (out of 1 available) 41016 1727204218.13753: exiting _queue_task() for managed-node1/set_fact 41016 1727204218.13767: done queuing things up, now waiting for results queue to drain 41016 1727204218.13769: waiting for pending results... 
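The two ansible_managed tasks above, and the fingerprint pair that follows, are all gated on profile_stat.stat.exists, which an earlier stat of the ifcfg file for this profile left False, so each of them short-circuits without touching the host. The check amounts to this (Python sketch; the ifcfg path is the conventional initscripts location and is an assumption here, only the False result is in the log):

    import os

    IFCFG = "/etc/sysconfig/network-scripts/ifcfg-ethtest0"   # assumed path

    profile_stat = {"stat": {"exists": os.path.exists(IFCFG)}}

    if profile_stat["stat"]["exists"]:        # "when: profile_stat.stat.exists"
        with open(IFCFG) as f:
            comments = [line for line in f if line.lstrip().startswith("#")]
        print("comment lines to inspect:", len(comments))
    else:
        print("Conditional result was False -> task skipped")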
41016 1727204218.13965: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 41016 1727204218.14044: in run() - task 028d2410-947f-12d5-0ec4-000000000b43 41016 1727204218.14056: variable 'ansible_search_path' from source: unknown 41016 1727204218.14060: variable 'ansible_search_path' from source: unknown 41016 1727204218.14088: calling self._execute() 41016 1727204218.14166: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.14169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.14181: variable 'omit' from source: magic vars 41016 1727204218.14449: variable 'ansible_distribution_major_version' from source: facts 41016 1727204218.14458: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204218.14545: variable 'profile_stat' from source: set_fact 41016 1727204218.14555: Evaluated conditional (profile_stat.stat.exists): False 41016 1727204218.14557: when evaluation is False, skipping this task 41016 1727204218.14560: _execute() done 41016 1727204218.14563: dumping result to json 41016 1727204218.14567: done dumping result, returning 41016 1727204218.14572: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [028d2410-947f-12d5-0ec4-000000000b43] 41016 1727204218.14584: sending task result for task 028d2410-947f-12d5-0ec4-000000000b43 41016 1727204218.14663: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b43 41016 1727204218.14666: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41016 1727204218.14717: no more pending results, returning what we have 41016 1727204218.14721: results queue empty 41016 1727204218.14722: checking for any_errors_fatal 41016 1727204218.14729: done checking for any_errors_fatal 41016 1727204218.14729: checking for max_fail_percentage 41016 1727204218.14731: done checking for max_fail_percentage 41016 1727204218.14732: checking to see if all hosts have failed and the running result is not ok 41016 1727204218.14733: done checking to see if all hosts have failed 41016 1727204218.14733: getting the remaining hosts for this loop 41016 1727204218.14735: done getting the remaining hosts for this loop 41016 1727204218.14738: getting the next task for host managed-node1 41016 1727204218.14746: done getting next task for host managed-node1 41016 1727204218.14748: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 41016 1727204218.14754: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204218.14757: getting variables 41016 1727204218.14758: in VariableManager get_vars() 41016 1727204218.14796: Calling all_inventory to load vars for managed-node1 41016 1727204218.14799: Calling groups_inventory to load vars for managed-node1 41016 1727204218.14801: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204218.14813: Calling all_plugins_play to load vars for managed-node1 41016 1727204218.14816: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204218.14818: Calling groups_plugins_play to load vars for managed-node1 41016 1727204218.15597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204218.16755: done with get_vars() 41016 1727204218.16778: done getting variables 41016 1727204218.16835: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204218.16939: variable 'profile' from source: include params 41016 1727204218.16943: variable 'item' from source: include params 41016 1727204218.16999: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:56:58 -0400 (0:00:00.035) 0:00:41.846 ***** 41016 1727204218.17028: entering _queue_task() for managed-node1/command 41016 1727204218.17294: worker is 1 (out of 1 available) 41016 1727204218.17308: exiting _queue_task() for managed-node1/command 41016 1727204218.17320: done queuing things up, now waiting for results queue to drain 41016 1727204218.17322: waiting for pending results... 
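Each TASK banner in this run carries two durations: the parenthesised value is the time spent on the task that just finished and the trailing value is the total elapsed playbook time, so consecutive banners should differ by roughly the parenthesised amount. A small parser over the four banners visible in this stretch confirms that (Python sketch; values copied from the log):

    import re

    banners = [
        "(0:00:00.469) 0:00:41.684",
        "(0:00:00.079) 0:00:41.764",
        "(0:00:00.046) 0:00:41.811",
        "(0:00:00.035) 0:00:41.846",
    ]

    def seconds(ts: str) -> float:
        h, m, s = ts.split(":")
        return int(h) * 3600 + int(m) * 60 + float(s)

    pattern = re.compile(r"\((?P<delta>[\d:.]+)\)\s+(?P<total>[\d:.]+)")
    previous = None
    for banner in banners:
        m = pattern.search(banner)
        delta, total = seconds(m["delta"]), seconds(m["total"])
        if previous is not None:
            assert abs(previous + delta - total) < 0.01   # totals accumulate
        previous = total
    print("timing banners are cumulative")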
41016 1727204218.17512: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-ethtest0 41016 1727204218.17591: in run() - task 028d2410-947f-12d5-0ec4-000000000b44 41016 1727204218.17601: variable 'ansible_search_path' from source: unknown 41016 1727204218.17605: variable 'ansible_search_path' from source: unknown 41016 1727204218.17632: calling self._execute() 41016 1727204218.17713: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.17720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.17727: variable 'omit' from source: magic vars 41016 1727204218.17992: variable 'ansible_distribution_major_version' from source: facts 41016 1727204218.18002: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204218.18088: variable 'profile_stat' from source: set_fact 41016 1727204218.18098: Evaluated conditional (profile_stat.stat.exists): False 41016 1727204218.18101: when evaluation is False, skipping this task 41016 1727204218.18103: _execute() done 41016 1727204218.18106: dumping result to json 41016 1727204218.18109: done dumping result, returning 41016 1727204218.18115: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-ethtest0 [028d2410-947f-12d5-0ec4-000000000b44] 41016 1727204218.18122: sending task result for task 028d2410-947f-12d5-0ec4-000000000b44 41016 1727204218.18207: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b44 41016 1727204218.18212: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41016 1727204218.18260: no more pending results, returning what we have 41016 1727204218.18265: results queue empty 41016 1727204218.18266: checking for any_errors_fatal 41016 1727204218.18274: done checking for any_errors_fatal 41016 1727204218.18276: checking for max_fail_percentage 41016 1727204218.18278: done checking for max_fail_percentage 41016 1727204218.18279: checking to see if all hosts have failed and the running result is not ok 41016 1727204218.18280: done checking to see if all hosts have failed 41016 1727204218.18281: getting the remaining hosts for this loop 41016 1727204218.18282: done getting the remaining hosts for this loop 41016 1727204218.18286: getting the next task for host managed-node1 41016 1727204218.18293: done getting next task for host managed-node1 41016 1727204218.18296: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 41016 1727204218.18300: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204218.18304: getting variables 41016 1727204218.18306: in VariableManager get_vars() 41016 1727204218.18345: Calling all_inventory to load vars for managed-node1 41016 1727204218.18347: Calling groups_inventory to load vars for managed-node1 41016 1727204218.18349: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204218.18358: Calling all_plugins_play to load vars for managed-node1 41016 1727204218.18361: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204218.18363: Calling groups_plugins_play to load vars for managed-node1 41016 1727204218.19279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204218.20139: done with get_vars() 41016 1727204218.20155: done getting variables 41016 1727204218.20198: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204218.20285: variable 'profile' from source: include params 41016 1727204218.20289: variable 'item' from source: include params 41016 1727204218.20340: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:56:58 -0400 (0:00:00.033) 0:00:41.879 ***** 41016 1727204218.20374: entering _queue_task() for managed-node1/set_fact 41016 1727204218.20661: worker is 1 (out of 1 available) 41016 1727204218.20674: exiting _queue_task() for managed-node1/set_fact 41016 1727204218.20689: done queuing things up, now waiting for results queue to drain 41016 1727204218.20690: waiting for pending results... 
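The templated task names are resolved right before each banner: the log pulls 'profile' and 'item' from include params, and the banner then shows ifcfg-{{ profile }} rendered as ifcfg-ethtest0. A minimal reproduction of that rendering with Jinja2, the templating engine Ansible uses (variable values taken from this run):

    from jinja2 import Template

    include_params = {"profile": "ethtest0", "item": "ethtest0"}

    raw_name = "Verify the fingerprint comment in ifcfg-{{ profile }}"
    print(Template(raw_name).render(**include_params))
    # -> Verify the fingerprint comment in ifcfg-ethtest0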
41016 1727204218.20943: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0 41016 1727204218.21067: in run() - task 028d2410-947f-12d5-0ec4-000000000b45 41016 1727204218.21108: variable 'ansible_search_path' from source: unknown 41016 1727204218.21113: variable 'ansible_search_path' from source: unknown 41016 1727204218.21140: calling self._execute() 41016 1727204218.21237: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.21241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.21254: variable 'omit' from source: magic vars 41016 1727204218.21577: variable 'ansible_distribution_major_version' from source: facts 41016 1727204218.21582: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204218.21716: variable 'profile_stat' from source: set_fact 41016 1727204218.21726: Evaluated conditional (profile_stat.stat.exists): False 41016 1727204218.21729: when evaluation is False, skipping this task 41016 1727204218.21733: _execute() done 41016 1727204218.21735: dumping result to json 41016 1727204218.21738: done dumping result, returning 41016 1727204218.21744: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [028d2410-947f-12d5-0ec4-000000000b45] 41016 1727204218.21748: sending task result for task 028d2410-947f-12d5-0ec4-000000000b45 41016 1727204218.21835: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b45 41016 1727204218.21837: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41016 1727204218.21884: no more pending results, returning what we have 41016 1727204218.21888: results queue empty 41016 1727204218.21889: checking for any_errors_fatal 41016 1727204218.21897: done checking for any_errors_fatal 41016 1727204218.21898: checking for max_fail_percentage 41016 1727204218.21899: done checking for max_fail_percentage 41016 1727204218.21900: checking to see if all hosts have failed and the running result is not ok 41016 1727204218.21901: done checking to see if all hosts have failed 41016 1727204218.21902: getting the remaining hosts for this loop 41016 1727204218.21904: done getting the remaining hosts for this loop 41016 1727204218.21908: getting the next task for host managed-node1 41016 1727204218.21917: done getting next task for host managed-node1 41016 1727204218.21920: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 41016 1727204218.21924: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204218.21927: getting variables 41016 1727204218.21929: in VariableManager get_vars() 41016 1727204218.21967: Calling all_inventory to load vars for managed-node1 41016 1727204218.21969: Calling groups_inventory to load vars for managed-node1 41016 1727204218.21971: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204218.21983: Calling all_plugins_play to load vars for managed-node1 41016 1727204218.21985: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204218.21988: Calling groups_plugins_play to load vars for managed-node1 41016 1727204218.23024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204218.24319: done with get_vars() 41016 1727204218.24350: done getting variables 41016 1727204218.24410: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204218.24493: variable 'profile' from source: include params 41016 1727204218.24496: variable 'item' from source: include params 41016 1727204218.24538: variable 'item' from source: include params TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 14:56:58 -0400 (0:00:00.041) 0:00:41.921 ***** 41016 1727204218.24560: entering _queue_task() for managed-node1/assert 41016 1727204218.24849: worker is 1 (out of 1 available) 41016 1727204218.24862: exiting _queue_task() for managed-node1/assert 41016 1727204218.24876: done queuing things up, now waiting for results queue to drain 41016 1727204218.24877: waiting for pending results... 
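The assert queued here is the payoff of the block: with no nmcli match and no ifcfg file, lsr_net_profile_exists stays falsy, so 'not lsr_net_profile_exists' should hold, and the evaluation a few lines below indeed reports all assertions passed. In plain Python the check is just this (sketch; the failure message is a hypothetical placeholder, the condition and flag name come from the log):

    lsr_net_profile_exists = False   # falsy, as the evaluation below implies

    def run_assert(conditions, fail_msg="assertion failed"):
        # Mirrors the assert action's behaviour: every listed condition must be truthy.
        if not all(conditions):
            raise AssertionError(fail_msg)
        return {"changed": False, "msg": "All assertions passed"}

    print(run_assert([not lsr_net_profile_exists],
                     fail_msg="profile 'ethtest0' is still present"))   # hypothetical msg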
41016 1727204218.25091: running TaskExecutor() for managed-node1/TASK: Assert that the profile is absent - 'ethtest0' 41016 1727204218.25158: in run() - task 028d2410-947f-12d5-0ec4-000000000a6d 41016 1727204218.25178: variable 'ansible_search_path' from source: unknown 41016 1727204218.25183: variable 'ansible_search_path' from source: unknown 41016 1727204218.25205: calling self._execute() 41016 1727204218.25308: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.25312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.25323: variable 'omit' from source: magic vars 41016 1727204218.25643: variable 'ansible_distribution_major_version' from source: facts 41016 1727204218.25653: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204218.25658: variable 'omit' from source: magic vars 41016 1727204218.25706: variable 'omit' from source: magic vars 41016 1727204218.25816: variable 'profile' from source: include params 41016 1727204218.25820: variable 'item' from source: include params 41016 1727204218.25874: variable 'item' from source: include params 41016 1727204218.25895: variable 'omit' from source: magic vars 41016 1727204218.25934: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204218.25973: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204218.25986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204218.26002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204218.26013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204218.26048: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204218.26052: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.26054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.26180: Set connection var ansible_shell_executable to /bin/sh 41016 1727204218.26183: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204218.26185: Set connection var ansible_shell_type to sh 41016 1727204218.26212: Set connection var ansible_timeout to 10 41016 1727204218.26216: Set connection var ansible_pipelining to False 41016 1727204218.26219: Set connection var ansible_connection to ssh 41016 1727204218.26250: variable 'ansible_shell_executable' from source: unknown 41016 1727204218.26253: variable 'ansible_connection' from source: unknown 41016 1727204218.26256: variable 'ansible_module_compression' from source: unknown 41016 1727204218.26258: variable 'ansible_shell_type' from source: unknown 41016 1727204218.26260: variable 'ansible_shell_executable' from source: unknown 41016 1727204218.26262: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.26264: variable 'ansible_pipelining' from source: unknown 41016 1727204218.26266: variable 'ansible_timeout' from source: unknown 41016 1727204218.26268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.26380: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204218.26390: variable 'omit' from source: magic vars 41016 1727204218.26396: starting attempt loop 41016 1727204218.26398: running the handler 41016 1727204218.26496: variable 'lsr_net_profile_exists' from source: set_fact 41016 1727204218.26500: Evaluated conditional (not lsr_net_profile_exists): True 41016 1727204218.26505: handler run complete 41016 1727204218.26517: attempt loop complete, returning result 41016 1727204218.26519: _execute() done 41016 1727204218.26522: dumping result to json 41016 1727204218.26526: done dumping result, returning 41016 1727204218.26532: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is absent - 'ethtest0' [028d2410-947f-12d5-0ec4-000000000a6d] 41016 1727204218.26534: sending task result for task 028d2410-947f-12d5-0ec4-000000000a6d 41016 1727204218.26620: done sending task result for task 028d2410-947f-12d5-0ec4-000000000a6d 41016 1727204218.26624: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 41016 1727204218.26670: no more pending results, returning what we have 41016 1727204218.26673: results queue empty 41016 1727204218.26674: checking for any_errors_fatal 41016 1727204218.26682: done checking for any_errors_fatal 41016 1727204218.26682: checking for max_fail_percentage 41016 1727204218.26684: done checking for max_fail_percentage 41016 1727204218.26685: checking to see if all hosts have failed and the running result is not ok 41016 1727204218.26686: done checking to see if all hosts have failed 41016 1727204218.26687: getting the remaining hosts for this loop 41016 1727204218.26689: done getting the remaining hosts for this loop 41016 1727204218.26692: getting the next task for host managed-node1 41016 1727204218.26700: done getting next task for host managed-node1 41016 1727204218.26704: ^ task is: TASK: Include the task 'get_profile_stat.yml' 41016 1727204218.26708: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204218.26714: getting variables 41016 1727204218.26716: in VariableManager get_vars() 41016 1727204218.26751: Calling all_inventory to load vars for managed-node1 41016 1727204218.26754: Calling groups_inventory to load vars for managed-node1 41016 1727204218.26756: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204218.26765: Calling all_plugins_play to load vars for managed-node1 41016 1727204218.26767: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204218.26770: Calling groups_plugins_play to load vars for managed-node1 41016 1727204218.27811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204218.28780: done with get_vars() 41016 1727204218.28796: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 14:56:58 -0400 (0:00:00.042) 0:00:41.964 ***** 41016 1727204218.28862: entering _queue_task() for managed-node1/include_tasks 41016 1727204218.29095: worker is 1 (out of 1 available) 41016 1727204218.29108: exiting _queue_task() for managed-node1/include_tasks 41016 1727204218.29121: done queuing things up, now waiting for results queue to drain 41016 1727204218.29122: waiting for pending results... 41016 1727204218.29304: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 41016 1727204218.29384: in run() - task 028d2410-947f-12d5-0ec4-000000000a71 41016 1727204218.29396: variable 'ansible_search_path' from source: unknown 41016 1727204218.29399: variable 'ansible_search_path' from source: unknown 41016 1727204218.29429: calling self._execute() 41016 1727204218.29510: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.29517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.29525: variable 'omit' from source: magic vars 41016 1727204218.29800: variable 'ansible_distribution_major_version' from source: facts 41016 1727204218.29811: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204218.29815: _execute() done 41016 1727204218.29821: dumping result to json 41016 1727204218.29823: done dumping result, returning 41016 1727204218.29829: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [028d2410-947f-12d5-0ec4-000000000a71] 41016 1727204218.29835: sending task result for task 028d2410-947f-12d5-0ec4-000000000a71 41016 1727204218.29921: done sending task result for task 028d2410-947f-12d5-0ec4-000000000a71 41016 1727204218.29923: WORKER PROCESS EXITING 41016 1727204218.29949: no more pending results, returning what we have 41016 1727204218.29954: in VariableManager get_vars() 41016 1727204218.30000: Calling all_inventory to load vars for managed-node1 41016 1727204218.30003: Calling groups_inventory to load vars for managed-node1 41016 1727204218.30006: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204218.30017: Calling all_plugins_play to load vars for managed-node1 41016 1727204218.30020: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204218.30022: Calling groups_plugins_play to load vars for managed-node1 41016 1727204218.30831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 41016 1727204218.31782: done with get_vars() 41016 1727204218.31794: variable 'ansible_search_path' from source: unknown 41016 1727204218.31795: variable 'ansible_search_path' from source: unknown 41016 1727204218.31824: we have included files to process 41016 1727204218.31825: generating all_blocks data 41016 1727204218.31827: done generating all_blocks data 41016 1727204218.31830: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 41016 1727204218.31831: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 41016 1727204218.31832: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 41016 1727204218.32637: done processing included file 41016 1727204218.32638: iterating over new_blocks loaded from include file 41016 1727204218.32639: in VariableManager get_vars() 41016 1727204218.32653: done with get_vars() 41016 1727204218.32654: filtering new block on tags 41016 1727204218.32737: done filtering new block on tags 41016 1727204218.32740: in VariableManager get_vars() 41016 1727204218.32762: done with get_vars() 41016 1727204218.32765: filtering new block on tags 41016 1727204218.32822: done filtering new block on tags 41016 1727204218.32826: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 41016 1727204218.32831: extending task lists for all hosts with included blocks 41016 1727204218.32955: done extending task lists 41016 1727204218.32956: done processing included files 41016 1727204218.32957: results queue empty 41016 1727204218.32957: checking for any_errors_fatal 41016 1727204218.32959: done checking for any_errors_fatal 41016 1727204218.32960: checking for max_fail_percentage 41016 1727204218.32961: done checking for max_fail_percentage 41016 1727204218.32961: checking to see if all hosts have failed and the running result is not ok 41016 1727204218.32962: done checking to see if all hosts have failed 41016 1727204218.32962: getting the remaining hosts for this loop 41016 1727204218.32963: done getting the remaining hosts for this loop 41016 1727204218.32965: getting the next task for host managed-node1 41016 1727204218.32969: done getting next task for host managed-node1 41016 1727204218.32971: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 41016 1727204218.32974: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204218.32977: getting variables 41016 1727204218.32978: in VariableManager get_vars() 41016 1727204218.32993: Calling all_inventory to load vars for managed-node1 41016 1727204218.32996: Calling groups_inventory to load vars for managed-node1 41016 1727204218.32998: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204218.33003: Calling all_plugins_play to load vars for managed-node1 41016 1727204218.33006: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204218.33013: Calling groups_plugins_play to load vars for managed-node1 41016 1727204218.34002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204218.35252: done with get_vars() 41016 1727204218.35285: done getting variables 41016 1727204218.35322: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:56:58 -0400 (0:00:00.064) 0:00:42.029 ***** 41016 1727204218.35355: entering _queue_task() for managed-node1/set_fact 41016 1727204218.35655: worker is 1 (out of 1 available) 41016 1727204218.35668: exiting _queue_task() for managed-node1/set_fact 41016 1727204218.35681: done queuing things up, now waiting for results queue to drain 41016 1727204218.35682: waiting for pending results... 
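The set_fact task queued here lives at get_profile_stat.yml:3. The task file itself is not reproduced in this log, but its observable effect (the ansible_facts block in the ok: result a few entries further down) implies a shape roughly like the sketch below; the module name and fact names/values come from the log, everything else is inferred.

# Sketch of the task at get_profile_stat.yml:3, inferred from this log only.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false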
41016 1727204218.35935: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 41016 1727204218.36061: in run() - task 028d2410-947f-12d5-0ec4-000000000b79 41016 1727204218.36065: variable 'ansible_search_path' from source: unknown 41016 1727204218.36067: variable 'ansible_search_path' from source: unknown 41016 1727204218.36090: calling self._execute() 41016 1727204218.36207: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.36211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.36214: variable 'omit' from source: magic vars 41016 1727204218.36584: variable 'ansible_distribution_major_version' from source: facts 41016 1727204218.36588: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204218.36591: variable 'omit' from source: magic vars 41016 1727204218.36651: variable 'omit' from source: magic vars 41016 1727204218.36694: variable 'omit' from source: magic vars 41016 1727204218.36764: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204218.36794: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204218.36825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204218.36845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204218.36848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204218.36880: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204218.36884: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.36888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.36957: Set connection var ansible_shell_executable to /bin/sh 41016 1727204218.36960: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204218.36969: Set connection var ansible_shell_type to sh 41016 1727204218.36972: Set connection var ansible_timeout to 10 41016 1727204218.36984: Set connection var ansible_pipelining to False 41016 1727204218.36986: Set connection var ansible_connection to ssh 41016 1727204218.37002: variable 'ansible_shell_executable' from source: unknown 41016 1727204218.37005: variable 'ansible_connection' from source: unknown 41016 1727204218.37008: variable 'ansible_module_compression' from source: unknown 41016 1727204218.37010: variable 'ansible_shell_type' from source: unknown 41016 1727204218.37012: variable 'ansible_shell_executable' from source: unknown 41016 1727204218.37019: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.37021: variable 'ansible_pipelining' from source: unknown 41016 1727204218.37024: variable 'ansible_timeout' from source: unknown 41016 1727204218.37026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.37170: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204218.37174: variable 
'omit' from source: magic vars 41016 1727204218.37208: starting attempt loop 41016 1727204218.37211: running the handler 41016 1727204218.37214: handler run complete 41016 1727204218.37216: attempt loop complete, returning result 41016 1727204218.37218: _execute() done 41016 1727204218.37223: dumping result to json 41016 1727204218.37225: done dumping result, returning 41016 1727204218.37269: done running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [028d2410-947f-12d5-0ec4-000000000b79] 41016 1727204218.37272: sending task result for task 028d2410-947f-12d5-0ec4-000000000b79 41016 1727204218.37337: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b79 41016 1727204218.37340: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 41016 1727204218.37402: no more pending results, returning what we have 41016 1727204218.37406: results queue empty 41016 1727204218.37407: checking for any_errors_fatal 41016 1727204218.37408: done checking for any_errors_fatal 41016 1727204218.37408: checking for max_fail_percentage 41016 1727204218.37409: done checking for max_fail_percentage 41016 1727204218.37410: checking to see if all hosts have failed and the running result is not ok 41016 1727204218.37411: done checking to see if all hosts have failed 41016 1727204218.37412: getting the remaining hosts for this loop 41016 1727204218.37413: done getting the remaining hosts for this loop 41016 1727204218.37416: getting the next task for host managed-node1 41016 1727204218.37424: done getting next task for host managed-node1 41016 1727204218.37427: ^ task is: TASK: Stat profile file 41016 1727204218.37432: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204218.37435: getting variables 41016 1727204218.37437: in VariableManager get_vars() 41016 1727204218.37478: Calling all_inventory to load vars for managed-node1 41016 1727204218.37483: Calling groups_inventory to load vars for managed-node1 41016 1727204218.37486: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204218.37496: Calling all_plugins_play to load vars for managed-node1 41016 1727204218.37498: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204218.37500: Calling groups_plugins_play to load vars for managed-node1 41016 1727204218.38391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204218.39284: done with get_vars() 41016 1727204218.39298: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:56:58 -0400 (0:00:00.040) 0:00:42.069 ***** 41016 1727204218.39362: entering _queue_task() for managed-node1/stat 41016 1727204218.39632: worker is 1 (out of 1 available) 41016 1727204218.39644: exiting _queue_task() for managed-node1/stat 41016 1727204218.39661: done queuing things up, now waiting for results queue to drain 41016 1727204218.39662: waiting for pending results... 41016 1727204218.39981: running TaskExecutor() for managed-node1/TASK: Stat profile file 41016 1727204218.40041: in run() - task 028d2410-947f-12d5-0ec4-000000000b7a 41016 1727204218.40051: variable 'ansible_search_path' from source: unknown 41016 1727204218.40056: variable 'ansible_search_path' from source: unknown 41016 1727204218.40122: calling self._execute() 41016 1727204218.40190: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.40195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.40203: variable 'omit' from source: magic vars 41016 1727204218.40507: variable 'ansible_distribution_major_version' from source: facts 41016 1727204218.40518: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204218.40525: variable 'omit' from source: magic vars 41016 1727204218.40556: variable 'omit' from source: magic vars 41016 1727204218.40646: variable 'profile' from source: include params 41016 1727204218.40650: variable 'item' from source: include params 41016 1727204218.40726: variable 'item' from source: include params 41016 1727204218.40744: variable 'omit' from source: magic vars 41016 1727204218.40777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204218.40807: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204218.40821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204218.40836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204218.40845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204218.40868: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204218.40870: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.40874: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.40946: Set connection var ansible_shell_executable to /bin/sh 41016 1727204218.40951: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204218.40957: Set connection var ansible_shell_type to sh 41016 1727204218.40962: Set connection var ansible_timeout to 10 41016 1727204218.40967: Set connection var ansible_pipelining to False 41016 1727204218.40973: Set connection var ansible_connection to ssh 41016 1727204218.40993: variable 'ansible_shell_executable' from source: unknown 41016 1727204218.40996: variable 'ansible_connection' from source: unknown 41016 1727204218.40998: variable 'ansible_module_compression' from source: unknown 41016 1727204218.41001: variable 'ansible_shell_type' from source: unknown 41016 1727204218.41003: variable 'ansible_shell_executable' from source: unknown 41016 1727204218.41005: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.41012: variable 'ansible_pipelining' from source: unknown 41016 1727204218.41015: variable 'ansible_timeout' from source: unknown 41016 1727204218.41018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.41159: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 41016 1727204218.41169: variable 'omit' from source: magic vars 41016 1727204218.41176: starting attempt loop 41016 1727204218.41179: running the handler 41016 1727204218.41192: _low_level_execute_command(): starting 41016 1727204218.41198: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204218.41744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204218.41748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204218.41751: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204218.41754: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204218.41802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204218.41807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204218.41809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204218.41899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204218.43692: stdout chunk (state=3): >>>/root <<< 41016 1727204218.43790: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 41016 1727204218.43824: stderr chunk (state=3): >>><<< 41016 1727204218.43827: stdout chunk (state=3): >>><<< 41016 1727204218.43846: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204218.43857: _low_level_execute_command(): starting 41016 1727204218.43862: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204218.4384568-43695-202672857928574 `" && echo ansible-tmp-1727204218.4384568-43695-202672857928574="` echo /root/.ansible/tmp/ansible-tmp-1727204218.4384568-43695-202672857928574 `" ) && sleep 0' 41016 1727204218.44401: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204218.44414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204218.44468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204218.44485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204218.44492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204218.44573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204218.46703: stdout chunk (state=3): >>>ansible-tmp-1727204218.4384568-43695-202672857928574=/root/.ansible/tmp/ansible-tmp-1727204218.4384568-43695-202672857928574 <<< 41016 1727204218.46798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 
1727204218.46837: stderr chunk (state=3): >>><<< 41016 1727204218.46840: stdout chunk (state=3): >>><<< 41016 1727204218.46852: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204218.4384568-43695-202672857928574=/root/.ansible/tmp/ansible-tmp-1727204218.4384568-43695-202672857928574 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204218.46894: variable 'ansible_module_compression' from source: unknown 41016 1727204218.46939: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 41016 1727204218.46966: variable 'ansible_facts' from source: unknown 41016 1727204218.47031: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204218.4384568-43695-202672857928574/AnsiballZ_stat.py 41016 1727204218.47127: Sending initial data 41016 1727204218.47130: Sent initial data (153 bytes) 41016 1727204218.47553: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204218.47560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204218.47585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204218.47589: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204218.47600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204218.47647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204218.47651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204218.47668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204218.47770: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204218.49533: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 41016 1727204218.49537: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204218.49625: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204218.49701: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpgrrjvrxw /root/.ansible/tmp/ansible-tmp-1727204218.4384568-43695-202672857928574/AnsiballZ_stat.py <<< 41016 1727204218.49706: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204218.4384568-43695-202672857928574/AnsiballZ_stat.py" <<< 41016 1727204218.49777: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpgrrjvrxw" to remote "/root/.ansible/tmp/ansible-tmp-1727204218.4384568-43695-202672857928574/AnsiballZ_stat.py" <<< 41016 1727204218.49781: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204218.4384568-43695-202672857928574/AnsiballZ_stat.py" <<< 41016 1727204218.50444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204218.50482: stderr chunk (state=3): >>><<< 41016 1727204218.50486: stdout chunk (state=3): >>><<< 41016 1727204218.50522: done transferring module to remote 41016 1727204218.50530: _low_level_execute_command(): starting 41016 1727204218.50534: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204218.4384568-43695-202672857928574/ /root/.ansible/tmp/ansible-tmp-1727204218.4384568-43695-202672857928574/AnsiballZ_stat.py && sleep 0' 41016 1727204218.51136: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204218.51234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204218.51239: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204218.51252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204218.51257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204218.51303: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204218.51378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204218.53364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204218.53417: stderr chunk (state=3): >>><<< 41016 1727204218.53420: stdout chunk (state=3): >>><<< 41016 1727204218.53459: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204218.53462: _low_level_execute_command(): starting 41016 1727204218.53464: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204218.4384568-43695-202672857928574/AnsiballZ_stat.py && sleep 0' 41016 1727204218.54087: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204218.54090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204218.54092: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204218.54094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204218.54129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204218.54134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 41016 1727204218.54222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204218.70856: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 41016 1727204218.72499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 41016 1727204218.72504: stdout chunk (state=3): >>><<< 41016 1727204218.72506: stderr chunk (state=3): >>><<< 41016 1727204218.72689: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
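The module_args embedded in the stat result above map directly onto the "Stat profile file" task at get_profile_stat.yml:9. A minimal reconstruction follows; the templating of the path from the profile include parameter (ethtest1 in this run) and the profile_stat register name (suggested by the profile_stat.stat.exists conditional evaluated later) are assumptions, the rest is taken from the module_args.

# Reconstructed from the stat module_args in this log; the path templating
# and the register name are assumptions, not shown in the log itself.
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat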
41016 1727204218.72694: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204218.4384568-43695-202672857928574/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204218.72697: _low_level_execute_command(): starting 41016 1727204218.72700: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204218.4384568-43695-202672857928574/ > /dev/null 2>&1 && sleep 0' 41016 1727204218.73696: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204218.73714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204218.73817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204218.73884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204218.74092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204218.76142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204218.76159: stdout chunk (state=3): >>><<< 41016 1727204218.76185: stderr chunk (state=3): >>><<< 41016 1727204218.76201: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204218.76212: handler run complete 41016 1727204218.76410: attempt loop complete, returning result 41016 1727204218.76413: _execute() done 41016 1727204218.76416: dumping result to json 41016 1727204218.76418: done dumping result, returning 41016 1727204218.76420: done running TaskExecutor() for managed-node1/TASK: Stat profile file [028d2410-947f-12d5-0ec4-000000000b7a] 41016 1727204218.76422: sending task result for task 028d2410-947f-12d5-0ec4-000000000b7a ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 41016 1727204218.76880: no more pending results, returning what we have 41016 1727204218.76885: results queue empty 41016 1727204218.76886: checking for any_errors_fatal 41016 1727204218.76895: done checking for any_errors_fatal 41016 1727204218.76896: checking for max_fail_percentage 41016 1727204218.76897: done checking for max_fail_percentage 41016 1727204218.76899: checking to see if all hosts have failed and the running result is not ok 41016 1727204218.76900: done checking to see if all hosts have failed 41016 1727204218.76901: getting the remaining hosts for this loop 41016 1727204218.76902: done getting the remaining hosts for this loop 41016 1727204218.76907: getting the next task for host managed-node1 41016 1727204218.76917: done getting next task for host managed-node1 41016 1727204218.76920: ^ task is: TASK: Set NM profile exist flag based on the profile files 41016 1727204218.76926: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204218.76930: getting variables 41016 1727204218.76932: in VariableManager get_vars() 41016 1727204218.76974: Calling all_inventory to load vars for managed-node1 41016 1727204218.77087: Calling groups_inventory to load vars for managed-node1 41016 1727204218.77091: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204218.77201: Calling all_plugins_play to load vars for managed-node1 41016 1727204218.77205: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204218.77208: Calling groups_plugins_play to load vars for managed-node1 41016 1727204218.77789: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b7a 41016 1727204218.77792: WORKER PROCESS EXITING 41016 1727204218.78640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204218.80280: done with get_vars() 41016 1727204218.80308: done getting variables 41016 1727204218.80379: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:56:58 -0400 (0:00:00.410) 0:00:42.480 ***** 41016 1727204218.80414: entering _queue_task() for managed-node1/set_fact 41016 1727204218.80996: worker is 1 (out of 1 available) 41016 1727204218.81007: exiting _queue_task() for managed-node1/set_fact 41016 1727204218.81018: done queuing things up, now waiting for results queue to drain 41016 1727204218.81019: waiting for pending results... 
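The set_fact task queued here (get_profile_stat.yml:17) ends up skipped in this run because the stat above found no profile file. A plausible sketch follows, with the conditional copied from the false_condition reported in the skip result just below; the fact assignment itself is an assumption about what the task would set when the file does exist.

# Plausible shape only: the when: clause matches the reported false_condition;
# setting lsr_net_profile_exists to true is an assumption.
- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true
  when: profile_stat.stat.exists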
41016 1727204218.81137: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 41016 1727204218.81285: in run() - task 028d2410-947f-12d5-0ec4-000000000b7b 41016 1727204218.81305: variable 'ansible_search_path' from source: unknown 41016 1727204218.81313: variable 'ansible_search_path' from source: unknown 41016 1727204218.81362: calling self._execute() 41016 1727204218.81464: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.81576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.81581: variable 'omit' from source: magic vars 41016 1727204218.81879: variable 'ansible_distribution_major_version' from source: facts 41016 1727204218.81902: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204218.82037: variable 'profile_stat' from source: set_fact 41016 1727204218.82054: Evaluated conditional (profile_stat.stat.exists): False 41016 1727204218.82063: when evaluation is False, skipping this task 41016 1727204218.82070: _execute() done 41016 1727204218.82078: dumping result to json 41016 1727204218.82087: done dumping result, returning 41016 1727204218.82096: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [028d2410-947f-12d5-0ec4-000000000b7b] 41016 1727204218.82105: sending task result for task 028d2410-947f-12d5-0ec4-000000000b7b skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41016 1727204218.82388: no more pending results, returning what we have 41016 1727204218.82394: results queue empty 41016 1727204218.82395: checking for any_errors_fatal 41016 1727204218.82409: done checking for any_errors_fatal 41016 1727204218.82410: checking for max_fail_percentage 41016 1727204218.82411: done checking for max_fail_percentage 41016 1727204218.82412: checking to see if all hosts have failed and the running result is not ok 41016 1727204218.82413: done checking to see if all hosts have failed 41016 1727204218.82414: getting the remaining hosts for this loop 41016 1727204218.82416: done getting the remaining hosts for this loop 41016 1727204218.82420: getting the next task for host managed-node1 41016 1727204218.82430: done getting next task for host managed-node1 41016 1727204218.82433: ^ task is: TASK: Get NM profile info 41016 1727204218.82447: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 41016 1727204218.82454: getting variables 41016 1727204218.82456: in VariableManager get_vars() 41016 1727204218.82503: Calling all_inventory to load vars for managed-node1 41016 1727204218.82506: Calling groups_inventory to load vars for managed-node1 41016 1727204218.82509: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204218.82671: Calling all_plugins_play to load vars for managed-node1 41016 1727204218.82675: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204218.82680: Calling groups_plugins_play to load vars for managed-node1 41016 1727204218.83283: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b7b 41016 1727204218.83287: WORKER PROCESS EXITING 41016 1727204218.84224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204218.85846: done with get_vars() 41016 1727204218.85870: done getting variables 41016 1727204218.85947: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:56:58 -0400 (0:00:00.055) 0:00:42.535 ***** 41016 1727204218.85984: entering _queue_task() for managed-node1/shell 41016 1727204218.86473: worker is 1 (out of 1 available) 41016 1727204218.86486: exiting _queue_task() for managed-node1/shell 41016 1727204218.86497: done queuing things up, now waiting for results queue to drain 41016 1727204218.86499: waiting for pending results... 
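The shell task queued here (get_profile_stat.yml:25) runs the nmcli pipeline that appears in the command result further down, where rc=1 simply means grep found no connection named ethtest1 backed by a file under /etc. A minimal reconstruction, assuming the grep pattern is templated from the profile include parameter; the register name is a placeholder.

# Reconstructed from the _raw_params in the command result below; the
# {{ profile }} templating and the register name are assumptions.
- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists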
41016 1727204218.86705: running TaskExecutor() for managed-node1/TASK: Get NM profile info 41016 1727204218.86833: in run() - task 028d2410-947f-12d5-0ec4-000000000b7c 41016 1727204218.86852: variable 'ansible_search_path' from source: unknown 41016 1727204218.86858: variable 'ansible_search_path' from source: unknown 41016 1727204218.86903: calling self._execute() 41016 1727204218.87011: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.87025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.87039: variable 'omit' from source: magic vars 41016 1727204218.87437: variable 'ansible_distribution_major_version' from source: facts 41016 1727204218.87458: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204218.87469: variable 'omit' from source: magic vars 41016 1727204218.87525: variable 'omit' from source: magic vars 41016 1727204218.87638: variable 'profile' from source: include params 41016 1727204218.87649: variable 'item' from source: include params 41016 1727204218.87722: variable 'item' from source: include params 41016 1727204218.87746: variable 'omit' from source: magic vars 41016 1727204218.87806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204218.87846: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204218.87868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204218.87981: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204218.87986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204218.87988: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204218.87990: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.87992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.88071: Set connection var ansible_shell_executable to /bin/sh 41016 1727204218.88085: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204218.88094: Set connection var ansible_shell_type to sh 41016 1727204218.88112: Set connection var ansible_timeout to 10 41016 1727204218.88122: Set connection var ansible_pipelining to False 41016 1727204218.88218: Set connection var ansible_connection to ssh 41016 1727204218.88221: variable 'ansible_shell_executable' from source: unknown 41016 1727204218.88223: variable 'ansible_connection' from source: unknown 41016 1727204218.88225: variable 'ansible_module_compression' from source: unknown 41016 1727204218.88227: variable 'ansible_shell_type' from source: unknown 41016 1727204218.88229: variable 'ansible_shell_executable' from source: unknown 41016 1727204218.88231: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204218.88233: variable 'ansible_pipelining' from source: unknown 41016 1727204218.88235: variable 'ansible_timeout' from source: unknown 41016 1727204218.88238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204218.88342: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204218.88363: variable 'omit' from source: magic vars 41016 1727204218.88373: starting attempt loop 41016 1727204218.88381: running the handler 41016 1727204218.88396: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204218.88435: _low_level_execute_command(): starting 41016 1727204218.88438: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204218.89197: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204218.89294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204218.89300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204218.89342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204218.89364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204218.89406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204218.89483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204218.91288: stdout chunk (state=3): >>>/root <<< 41016 1727204218.91439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204218.91442: stdout chunk (state=3): >>><<< 41016 1727204218.91445: stderr chunk (state=3): >>><<< 41016 1727204218.91469: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204218.91497: _low_level_execute_command(): starting 41016 1727204218.91589: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204218.9148204-43715-254130829480330 `" && echo ansible-tmp-1727204218.9148204-43715-254130829480330="` echo /root/.ansible/tmp/ansible-tmp-1727204218.9148204-43715-254130829480330 `" ) && sleep 0' 41016 1727204218.92130: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204218.92155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204218.92169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204218.92193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204218.92213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204218.92225: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204218.92239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204218.92292: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204218.92338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204218.92355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204218.92373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204218.92489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204218.94630: stdout chunk (state=3): >>>ansible-tmp-1727204218.9148204-43715-254130829480330=/root/.ansible/tmp/ansible-tmp-1727204218.9148204-43715-254130829480330 <<< 41016 1727204218.95052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204218.95055: stdout chunk (state=3): >>><<< 41016 1727204218.95057: stderr chunk (state=3): >>><<< 41016 1727204218.95059: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204218.9148204-43715-254130829480330=/root/.ansible/tmp/ansible-tmp-1727204218.9148204-43715-254130829480330 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204218.95062: variable 'ansible_module_compression' from source: unknown 41016 1727204218.95064: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204218.95066: variable 'ansible_facts' from source: unknown 41016 1727204218.95224: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204218.9148204-43715-254130829480330/AnsiballZ_command.py 41016 1727204218.95607: Sending initial data 41016 1727204218.95610: Sent initial data (156 bytes) 41016 1727204218.95936: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204218.95954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204218.95967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204218.95993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204218.96009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204218.96020: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204218.96033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204218.96058: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204218.96147: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204218.96179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204218.96285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204218.98050: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204218.98151: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204218.98341: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpkvn4vrst /root/.ansible/tmp/ansible-tmp-1727204218.9148204-43715-254130829480330/AnsiballZ_command.py <<< 41016 1727204218.98424: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204218.9148204-43715-254130829480330/AnsiballZ_command.py" <<< 41016 1727204218.98457: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpkvn4vrst" to remote "/root/.ansible/tmp/ansible-tmp-1727204218.9148204-43715-254130829480330/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204218.9148204-43715-254130829480330/AnsiballZ_command.py" <<< 41016 1727204218.99598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204218.99758: stderr chunk (state=3): >>><<< 41016 1727204218.99798: stdout chunk (state=3): >>><<< 41016 1727204219.00033: done transferring module to remote 41016 1727204219.00037: _low_level_execute_command(): starting 41016 1727204219.00039: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204218.9148204-43715-254130829480330/ /root/.ansible/tmp/ansible-tmp-1727204218.9148204-43715-254130829480330/AnsiballZ_command.py && sleep 0' 41016 1727204219.01741: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204219.01802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204219.01821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204219.01863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204219.01900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204219.02007: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204219.02030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204219.02148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204219.04109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204219.04170: stderr chunk (state=3): >>><<< 41016 1727204219.04173: 
stdout chunk (state=3): >>><<< 41016 1727204219.04198: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204219.04201: _low_level_execute_command(): starting 41016 1727204219.04213: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204218.9148204-43715-254130829480330/AnsiballZ_command.py && sleep 0' 41016 1727204219.05048: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204219.05051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204219.05130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204219.23672: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "start": "2024-09-24 14:56:59.215148", "end": "2024-09-24 14:56:59.232304", "delta": "0:00:00.017156", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204219.25421: stderr chunk (state=3): >>>debug2: Received 
exit status from master 1 Shared connection to 10.31.14.47 closed. <<< 41016 1727204219.25689: stderr chunk (state=3): >>><<< 41016 1727204219.25693: stdout chunk (state=3): >>><<< 41016 1727204219.25696: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "start": "2024-09-24 14:56:59.215148", "end": "2024-09-24 14:56:59.232304", "delta": "0:00:00.017156", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.14.47 closed. 
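The module invocation just executed corresponds to a probe task in get_profile_stat.yml that checks whether NetworkManager still has a persistent profile for ethtest1. A minimal sketch of such a task, assuming the register name nm_profile_exists (referenced by a later when condition) and the ignore_errors behaviour implied by the "...ignoring" marker further down:

    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
      register: nm_profile_exists
      ignore_errors: true   # the run continues despite rc=1, so command failures are tolerated

Here rc=1 simply means grep matched nothing, i.e. no connection named ethtest1 is backed by a file under /etc, which is the expected state once the profile has been removed.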
41016 1727204219.25699: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204218.9148204-43715-254130829480330/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204219.25706: _low_level_execute_command(): starting 41016 1727204219.25708: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204218.9148204-43715-254130829480330/ > /dev/null 2>&1 && sleep 0' 41016 1727204219.26783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204219.26908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204219.26936: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204219.26939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204219.26987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204219.26990: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204219.26993: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41016 1727204219.26995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204219.26997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204219.26999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204219.27002: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204219.27030: stderr chunk (state=3): >>>debug2: match found <<< 41016 1727204219.27033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204219.27249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204219.27253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204219.27255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204219.27270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204219.29284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204219.29288: stdout chunk (state=3): >>><<< 41016 1727204219.29290: stderr chunk (state=3): >>><<< 41016 1727204219.29307: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204219.29346: handler run complete 41016 1727204219.29378: Evaluated conditional (False): False 41016 1727204219.29448: attempt loop complete, returning result 41016 1727204219.29462: _execute() done 41016 1727204219.29471: dumping result to json 41016 1727204219.29670: done dumping result, returning 41016 1727204219.29673: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [028d2410-947f-12d5-0ec4-000000000b7c] 41016 1727204219.29678: sending task result for task 028d2410-947f-12d5-0ec4-000000000b7c 41016 1727204219.29753: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b7c 41016 1727204219.29757: WORKER PROCESS EXITING fatal: [managed-node1]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "delta": "0:00:00.017156", "end": "2024-09-24 14:56:59.232304", "rc": 1, "start": "2024-09-24 14:56:59.215148" } MSG: non-zero return code ...ignoring 41016 1727204219.29850: no more pending results, returning what we have 41016 1727204219.29854: results queue empty 41016 1727204219.29855: checking for any_errors_fatal 41016 1727204219.29861: done checking for any_errors_fatal 41016 1727204219.29862: checking for max_fail_percentage 41016 1727204219.29864: done checking for max_fail_percentage 41016 1727204219.29865: checking to see if all hosts have failed and the running result is not ok 41016 1727204219.29866: done checking to see if all hosts have failed 41016 1727204219.29866: getting the remaining hosts for this loop 41016 1727204219.29868: done getting the remaining hosts for this loop 41016 1727204219.29871: getting the next task for host managed-node1 41016 1727204219.29984: done getting next task for host managed-node1 41016 1727204219.29995: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 41016 1727204219.30001: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204219.30007: getting variables 41016 1727204219.30009: in VariableManager get_vars() 41016 1727204219.30056: Calling all_inventory to load vars for managed-node1 41016 1727204219.30060: Calling groups_inventory to load vars for managed-node1 41016 1727204219.30062: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204219.30074: Calling all_plugins_play to load vars for managed-node1 41016 1727204219.30302: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204219.30308: Calling groups_plugins_play to load vars for managed-node1 41016 1727204219.33638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204219.52128: done with get_vars() 41016 1727204219.52271: done getting variables 41016 1727204219.52325: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:56:59 -0400 (0:00:00.663) 0:00:43.199 ***** 41016 1727204219.52360: entering _queue_task() for managed-node1/set_fact 41016 1727204219.53243: worker is 1 (out of 1 available) 41016 1727204219.53254: exiting _queue_task() for managed-node1/set_fact 41016 1727204219.53264: done queuing things up, now waiting for results queue to drain 41016 1727204219.53266: waiting for pending results... 
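The set_fact task queued next only fires when the nmcli probe succeeded. A plausible sketch, with the fact assignments treated as assumptions (the log confirms only the task name, the guard nm_profile_exists.rc == 0, and that a fact called lsr_net_profile_exists exists by the time the later assert runs):

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      set_fact:
        lsr_net_profile_exists: true            # fact name taken from the later assert; setting it here is an assumption
        lsr_net_profile_ansible_managed: true   # hypothetical companion fact suggested by the task name
      when: nm_profile_exists.rc == 0

Because the probe returned rc=1, this guard evaluates False and the task is skipped, as the worker result just below reports.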
41016 1727204219.53799: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 41016 1727204219.54040: in run() - task 028d2410-947f-12d5-0ec4-000000000b7d 41016 1727204219.54115: variable 'ansible_search_path' from source: unknown 41016 1727204219.54125: variable 'ansible_search_path' from source: unknown 41016 1727204219.54309: calling self._execute() 41016 1727204219.54429: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204219.54447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204219.54656: variable 'omit' from source: magic vars 41016 1727204219.55526: variable 'ansible_distribution_major_version' from source: facts 41016 1727204219.55531: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204219.55848: variable 'nm_profile_exists' from source: set_fact 41016 1727204219.55852: Evaluated conditional (nm_profile_exists.rc == 0): False 41016 1727204219.55857: when evaluation is False, skipping this task 41016 1727204219.55860: _execute() done 41016 1727204219.55863: dumping result to json 41016 1727204219.55865: done dumping result, returning 41016 1727204219.55868: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [028d2410-947f-12d5-0ec4-000000000b7d] 41016 1727204219.55872: sending task result for task 028d2410-947f-12d5-0ec4-000000000b7d 41016 1727204219.55945: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b7d skipping: [managed-node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 41016 1727204219.56004: no more pending results, returning what we have 41016 1727204219.56009: results queue empty 41016 1727204219.56012: checking for any_errors_fatal 41016 1727204219.56024: done checking for any_errors_fatal 41016 1727204219.56025: checking for max_fail_percentage 41016 1727204219.56027: done checking for max_fail_percentage 41016 1727204219.56028: checking to see if all hosts have failed and the running result is not ok 41016 1727204219.56029: done checking to see if all hosts have failed 41016 1727204219.56029: getting the remaining hosts for this loop 41016 1727204219.56031: done getting the remaining hosts for this loop 41016 1727204219.56035: getting the next task for host managed-node1 41016 1727204219.56044: done getting next task for host managed-node1 41016 1727204219.56047: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 41016 1727204219.56052: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204219.56063: getting variables 41016 1727204219.56065: in VariableManager get_vars() 41016 1727204219.56115: Calling all_inventory to load vars for managed-node1 41016 1727204219.56118: Calling groups_inventory to load vars for managed-node1 41016 1727204219.56121: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204219.56135: Calling all_plugins_play to load vars for managed-node1 41016 1727204219.56138: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204219.56141: Calling groups_plugins_play to load vars for managed-node1 41016 1727204219.57226: WORKER PROCESS EXITING 41016 1727204219.58380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204219.60480: done with get_vars() 41016 1727204219.60507: done getting variables 41016 1727204219.60571: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204219.60757: variable 'profile' from source: include params 41016 1727204219.60761: variable 'item' from source: include params 41016 1727204219.60826: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-ethtest1] *********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:56:59 -0400 (0:00:00.084) 0:00:43.284 ***** 41016 1727204219.60861: entering _queue_task() for managed-node1/command 41016 1727204219.61207: worker is 1 (out of 1 available) 41016 1727204219.61220: exiting _queue_task() for managed-node1/command 41016 1727204219.61232: done queuing things up, now waiting for results queue to drain 41016 1727204219.61233: waiting for pending results... 
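The next few tasks inspect an ifcfg-ethtest1 file and are all guarded by profile_stat.stat.exists. A sketch of the first of them, with the grep command and file path assumed (the log confirms only the task name, its location at get_profile_stat.yml:49, and the guard):

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      command: grep 'ansible_managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # command body and path are assumptions
      when: profile_stat.stat.exists

Since no ifcfg file exists for ethtest1, this and the companion tasks that follow are skipped rather than executed.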
41016 1727204219.61541: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-ethtest1 41016 1727204219.61659: in run() - task 028d2410-947f-12d5-0ec4-000000000b7f 41016 1727204219.61671: variable 'ansible_search_path' from source: unknown 41016 1727204219.61677: variable 'ansible_search_path' from source: unknown 41016 1727204219.61746: calling self._execute() 41016 1727204219.61818: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204219.61826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204219.61853: variable 'omit' from source: magic vars 41016 1727204219.62227: variable 'ansible_distribution_major_version' from source: facts 41016 1727204219.62236: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204219.62480: variable 'profile_stat' from source: set_fact 41016 1727204219.62483: Evaluated conditional (profile_stat.stat.exists): False 41016 1727204219.62485: when evaluation is False, skipping this task 41016 1727204219.62487: _execute() done 41016 1727204219.62489: dumping result to json 41016 1727204219.62491: done dumping result, returning 41016 1727204219.62493: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-ethtest1 [028d2410-947f-12d5-0ec4-000000000b7f] 41016 1727204219.62495: sending task result for task 028d2410-947f-12d5-0ec4-000000000b7f 41016 1727204219.62556: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b7f 41016 1727204219.62559: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41016 1727204219.62622: no more pending results, returning what we have 41016 1727204219.62626: results queue empty 41016 1727204219.62628: checking for any_errors_fatal 41016 1727204219.62634: done checking for any_errors_fatal 41016 1727204219.62635: checking for max_fail_percentage 41016 1727204219.62636: done checking for max_fail_percentage 41016 1727204219.62638: checking to see if all hosts have failed and the running result is not ok 41016 1727204219.62638: done checking to see if all hosts have failed 41016 1727204219.62639: getting the remaining hosts for this loop 41016 1727204219.62641: done getting the remaining hosts for this loop 41016 1727204219.62646: getting the next task for host managed-node1 41016 1727204219.62656: done getting next task for host managed-node1 41016 1727204219.62659: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 41016 1727204219.62665: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204219.62668: getting variables 41016 1727204219.62670: in VariableManager get_vars() 41016 1727204219.62721: Calling all_inventory to load vars for managed-node1 41016 1727204219.62725: Calling groups_inventory to load vars for managed-node1 41016 1727204219.62727: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204219.62740: Calling all_plugins_play to load vars for managed-node1 41016 1727204219.62743: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204219.62745: Calling groups_plugins_play to load vars for managed-node1 41016 1727204219.64608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204219.66286: done with get_vars() 41016 1727204219.66310: done getting variables 41016 1727204219.66373: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204219.66494: variable 'profile' from source: include params 41016 1727204219.66498: variable 'item' from source: include params 41016 1727204219.66557: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-ethtest1] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:56:59 -0400 (0:00:00.057) 0:00:43.342 ***** 41016 1727204219.66594: entering _queue_task() for managed-node1/set_fact 41016 1727204219.66949: worker is 1 (out of 1 available) 41016 1727204219.66961: exiting _queue_task() for managed-node1/set_fact 41016 1727204219.67178: done queuing things up, now waiting for results queue to drain 41016 1727204219.67180: waiting for pending results... 
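The matching verification step is a set_fact under the same guard. Its body is not visible in this log, so everything below the task name in the sketch is an assumption:

    - name: Verify the ansible_managed comment in ifcfg-{{ profile }}
      set_fact:
        lsr_net_profile_ansible_managed: true   # hypothetical fact; only the task name and the profile_stat.stat.exists guard are confirmed
      when: profile_stat.stat.exists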
41016 1727204219.67680: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest1 41016 1727204219.68022: in run() - task 028d2410-947f-12d5-0ec4-000000000b80 41016 1727204219.68026: variable 'ansible_search_path' from source: unknown 41016 1727204219.68029: variable 'ansible_search_path' from source: unknown 41016 1727204219.68032: calling self._execute() 41016 1727204219.68500: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204219.68504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204219.68508: variable 'omit' from source: magic vars 41016 1727204219.69828: variable 'ansible_distribution_major_version' from source: facts 41016 1727204219.69900: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204219.70252: variable 'profile_stat' from source: set_fact 41016 1727204219.70349: Evaluated conditional (profile_stat.stat.exists): False 41016 1727204219.70359: when evaluation is False, skipping this task 41016 1727204219.70368: _execute() done 41016 1727204219.70378: dumping result to json 41016 1727204219.70390: done dumping result, returning 41016 1727204219.70452: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-ethtest1 [028d2410-947f-12d5-0ec4-000000000b80] 41016 1727204219.70464: sending task result for task 028d2410-947f-12d5-0ec4-000000000b80 41016 1727204219.70947: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b80 41016 1727204219.70951: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41016 1727204219.71024: no more pending results, returning what we have 41016 1727204219.71029: results queue empty 41016 1727204219.71030: checking for any_errors_fatal 41016 1727204219.71037: done checking for any_errors_fatal 41016 1727204219.71037: checking for max_fail_percentage 41016 1727204219.71039: done checking for max_fail_percentage 41016 1727204219.71040: checking to see if all hosts have failed and the running result is not ok 41016 1727204219.71041: done checking to see if all hosts have failed 41016 1727204219.71041: getting the remaining hosts for this loop 41016 1727204219.71043: done getting the remaining hosts for this loop 41016 1727204219.71047: getting the next task for host managed-node1 41016 1727204219.71055: done getting next task for host managed-node1 41016 1727204219.71057: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 41016 1727204219.71062: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204219.71066: getting variables 41016 1727204219.71067: in VariableManager get_vars() 41016 1727204219.71118: Calling all_inventory to load vars for managed-node1 41016 1727204219.71122: Calling groups_inventory to load vars for managed-node1 41016 1727204219.71125: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204219.71138: Calling all_plugins_play to load vars for managed-node1 41016 1727204219.71142: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204219.71145: Calling groups_plugins_play to load vars for managed-node1 41016 1727204219.72600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204219.75724: done with get_vars() 41016 1727204219.75757: done getting variables 41016 1727204219.75833: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204219.76666: variable 'profile' from source: include params 41016 1727204219.76671: variable 'item' from source: include params 41016 1727204219.76745: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-ethtest1] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:56:59 -0400 (0:00:00.102) 0:00:43.445 ***** 41016 1727204219.76895: entering _queue_task() for managed-node1/command 41016 1727204219.77996: worker is 1 (out of 1 available) 41016 1727204219.78013: exiting _queue_task() for managed-node1/command 41016 1727204219.78028: done queuing things up, now waiting for results queue to drain 41016 1727204219.78030: waiting for pending results... 
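All of these ifcfg checks are skipped because profile_stat.stat.exists is false. The profile_stat variable would have been registered earlier in get_profile_stat.yml by a stat task roughly like the sketch below; the task name and path are assumptions, and only the registered variable and its .stat.exists attribute are confirmed by this log:

    - name: Get stat of the profile file   # hypothetical task name
      stat:
        path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # path is an assumption
      register: profile_stat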
41016 1727204219.78232: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-ethtest1 41016 1727204219.78639: in run() - task 028d2410-947f-12d5-0ec4-000000000b81 41016 1727204219.78652: variable 'ansible_search_path' from source: unknown 41016 1727204219.78655: variable 'ansible_search_path' from source: unknown 41016 1727204219.78779: calling self._execute() 41016 1727204219.79027: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204219.79032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204219.79042: variable 'omit' from source: magic vars 41016 1727204219.79752: variable 'ansible_distribution_major_version' from source: facts 41016 1727204219.79756: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204219.79834: variable 'profile_stat' from source: set_fact 41016 1727204219.79850: Evaluated conditional (profile_stat.stat.exists): False 41016 1727204219.79854: when evaluation is False, skipping this task 41016 1727204219.79857: _execute() done 41016 1727204219.79859: dumping result to json 41016 1727204219.79862: done dumping result, returning 41016 1727204219.79874: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-ethtest1 [028d2410-947f-12d5-0ec4-000000000b81] 41016 1727204219.79879: sending task result for task 028d2410-947f-12d5-0ec4-000000000b81 41016 1727204219.80055: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b81 41016 1727204219.80058: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41016 1727204219.80116: no more pending results, returning what we have 41016 1727204219.80121: results queue empty 41016 1727204219.80122: checking for any_errors_fatal 41016 1727204219.80129: done checking for any_errors_fatal 41016 1727204219.80129: checking for max_fail_percentage 41016 1727204219.80131: done checking for max_fail_percentage 41016 1727204219.80132: checking to see if all hosts have failed and the running result is not ok 41016 1727204219.80133: done checking to see if all hosts have failed 41016 1727204219.80134: getting the remaining hosts for this loop 41016 1727204219.80140: done getting the remaining hosts for this loop 41016 1727204219.80144: getting the next task for host managed-node1 41016 1727204219.80152: done getting next task for host managed-node1 41016 1727204219.80155: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 41016 1727204219.80161: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204219.80164: getting variables 41016 1727204219.80166: in VariableManager get_vars() 41016 1727204219.80217: Calling all_inventory to load vars for managed-node1 41016 1727204219.80220: Calling groups_inventory to load vars for managed-node1 41016 1727204219.80223: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204219.80236: Calling all_plugins_play to load vars for managed-node1 41016 1727204219.80240: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204219.80243: Calling groups_plugins_play to load vars for managed-node1 41016 1727204219.83666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204219.87112: done with get_vars() 41016 1727204219.87140: done getting variables 41016 1727204219.87209: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204219.87549: variable 'profile' from source: include params 41016 1727204219.87554: variable 'item' from source: include params 41016 1727204219.87624: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-ethtest1] ************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:56:59 -0400 (0:00:00.107) 0:00:43.552 ***** 41016 1727204219.87659: entering _queue_task() for managed-node1/set_fact 41016 1727204219.88451: worker is 1 (out of 1 available) 41016 1727204219.88464: exiting _queue_task() for managed-node1/set_fact 41016 1727204219.88879: done queuing things up, now waiting for results queue to drain 41016 1727204219.88881: waiting for pending results... 
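Every task in this stretch is also evaluated under the conditional ansible_distribution_major_version != '6' before its own guard is considered. In playbook terms that is just a string comparison against a gathered fact, for example:

    - name: Example of the distribution guard used throughout (hypothetical task)
      debug:
        msg: skipped entirely on EL6-era systems
      when: ansible_distribution_major_version != '6'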
41016 1727204219.89380: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-ethtest1 41016 1727204219.89463: in run() - task 028d2410-947f-12d5-0ec4-000000000b82 41016 1727204219.89483: variable 'ansible_search_path' from source: unknown 41016 1727204219.89487: variable 'ansible_search_path' from source: unknown 41016 1727204219.89535: calling self._execute() 41016 1727204219.89951: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204219.89955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204219.89957: variable 'omit' from source: magic vars 41016 1727204219.90812: variable 'ansible_distribution_major_version' from source: facts 41016 1727204219.90831: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204219.91092: variable 'profile_stat' from source: set_fact 41016 1727204219.91260: Evaluated conditional (profile_stat.stat.exists): False 41016 1727204219.91264: when evaluation is False, skipping this task 41016 1727204219.91267: _execute() done 41016 1727204219.91272: dumping result to json 41016 1727204219.91277: done dumping result, returning 41016 1727204219.91279: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-ethtest1 [028d2410-947f-12d5-0ec4-000000000b82] 41016 1727204219.91281: sending task result for task 028d2410-947f-12d5-0ec4-000000000b82 41016 1727204219.91539: done sending task result for task 028d2410-947f-12d5-0ec4-000000000b82 41016 1727204219.91543: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 41016 1727204219.91735: no more pending results, returning what we have 41016 1727204219.91740: results queue empty 41016 1727204219.91741: checking for any_errors_fatal 41016 1727204219.91748: done checking for any_errors_fatal 41016 1727204219.91749: checking for max_fail_percentage 41016 1727204219.91751: done checking for max_fail_percentage 41016 1727204219.91752: checking to see if all hosts have failed and the running result is not ok 41016 1727204219.91753: done checking to see if all hosts have failed 41016 1727204219.91754: getting the remaining hosts for this loop 41016 1727204219.91756: done getting the remaining hosts for this loop 41016 1727204219.91762: getting the next task for host managed-node1 41016 1727204219.91771: done getting next task for host managed-node1 41016 1727204219.91777: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 41016 1727204219.91782: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204219.91787: getting variables 41016 1727204219.91789: in VariableManager get_vars() 41016 1727204219.91853: Calling all_inventory to load vars for managed-node1 41016 1727204219.91856: Calling groups_inventory to load vars for managed-node1 41016 1727204219.91859: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204219.91873: Calling all_plugins_play to load vars for managed-node1 41016 1727204219.92239: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204219.92245: Calling groups_plugins_play to load vars for managed-node1 41016 1727204219.94869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204219.98324: done with get_vars() 41016 1727204219.98355: done getting variables 41016 1727204219.98421: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 41016 1727204219.98751: variable 'profile' from source: include params 41016 1727204219.98755: variable 'item' from source: include params 41016 1727204219.98812: variable 'item' from source: include params TASK [Assert that the profile is absent - 'ethtest1'] ************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 14:56:59 -0400 (0:00:00.111) 0:00:43.664 ***** 41016 1727204219.98845: entering _queue_task() for managed-node1/assert 41016 1727204219.99820: worker is 1 (out of 1 available) 41016 1727204219.99830: exiting _queue_task() for managed-node1/assert 41016 1727204219.99841: done queuing things up, now waiting for results queue to drain 41016 1727204219.99842: waiting for pending results... 
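The assert that runs next is well constrained by the log: it checks not lsr_net_profile_exists and succeeds with assert's default "All assertions passed" message. A minimal sketch:

    - name: Assert that the profile is absent - '{{ profile }}'
      assert:
        that:
          - not lsr_net_profile_exists

Any failure message text or additional assertions in the real task file are not visible in this excerpt.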
41016 1727204220.00495: running TaskExecutor() for managed-node1/TASK: Assert that the profile is absent - 'ethtest1' 41016 1727204220.00594: in run() - task 028d2410-947f-12d5-0ec4-000000000a72 41016 1727204220.00599: variable 'ansible_search_path' from source: unknown 41016 1727204220.00601: variable 'ansible_search_path' from source: unknown 41016 1727204220.00604: calling self._execute() 41016 1727204220.00918: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204220.00925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204220.00930: variable 'omit' from source: magic vars 41016 1727204220.01981: variable 'ansible_distribution_major_version' from source: facts 41016 1727204220.01985: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204220.01987: variable 'omit' from source: magic vars 41016 1727204220.02099: variable 'omit' from source: magic vars 41016 1727204220.02354: variable 'profile' from source: include params 41016 1727204220.02365: variable 'item' from source: include params 41016 1727204220.02596: variable 'item' from source: include params 41016 1727204220.02599: variable 'omit' from source: magic vars 41016 1727204220.02601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204220.02691: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204220.02695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204220.02697: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204220.02700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204220.02817: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204220.02820: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204220.02823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204220.02896: Set connection var ansible_shell_executable to /bin/sh 41016 1727204220.02899: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204220.02906: Set connection var ansible_shell_type to sh 41016 1727204220.02908: Set connection var ansible_timeout to 10 41016 1727204220.02910: Set connection var ansible_pipelining to False 41016 1727204220.02913: Set connection var ansible_connection to ssh 41016 1727204220.02980: variable 'ansible_shell_executable' from source: unknown 41016 1727204220.02984: variable 'ansible_connection' from source: unknown 41016 1727204220.02986: variable 'ansible_module_compression' from source: unknown 41016 1727204220.02992: variable 'ansible_shell_type' from source: unknown 41016 1727204220.02994: variable 'ansible_shell_executable' from source: unknown 41016 1727204220.02996: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204220.02998: variable 'ansible_pipelining' from source: unknown 41016 1727204220.03139: variable 'ansible_timeout' from source: unknown 41016 1727204220.03142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204220.03203: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204220.03218: variable 'omit' from source: magic vars 41016 1727204220.03224: starting attempt loop 41016 1727204220.03227: running the handler 41016 1727204220.03434: variable 'lsr_net_profile_exists' from source: set_fact 41016 1727204220.03437: Evaluated conditional (not lsr_net_profile_exists): True 41016 1727204220.03444: handler run complete 41016 1727204220.03458: attempt loop complete, returning result 41016 1727204220.03461: _execute() done 41016 1727204220.03464: dumping result to json 41016 1727204220.03466: done dumping result, returning 41016 1727204220.03779: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is absent - 'ethtest1' [028d2410-947f-12d5-0ec4-000000000a72] 41016 1727204220.03782: sending task result for task 028d2410-947f-12d5-0ec4-000000000a72 41016 1727204220.03843: done sending task result for task 028d2410-947f-12d5-0ec4-000000000a72 41016 1727204220.03846: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 41016 1727204220.03895: no more pending results, returning what we have 41016 1727204220.03899: results queue empty 41016 1727204220.03900: checking for any_errors_fatal 41016 1727204220.03905: done checking for any_errors_fatal 41016 1727204220.03906: checking for max_fail_percentage 41016 1727204220.03907: done checking for max_fail_percentage 41016 1727204220.03908: checking to see if all hosts have failed and the running result is not ok 41016 1727204220.03909: done checking to see if all hosts have failed 41016 1727204220.03912: getting the remaining hosts for this loop 41016 1727204220.03914: done getting the remaining hosts for this loop 41016 1727204220.03918: getting the next task for host managed-node1 41016 1727204220.03927: done getting next task for host managed-node1 41016 1727204220.03931: ^ task is: TASK: Verify network state restored to default 41016 1727204220.03934: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204220.03938: getting variables 41016 1727204220.03940: in VariableManager get_vars() 41016 1727204220.04081: Calling all_inventory to load vars for managed-node1 41016 1727204220.04084: Calling groups_inventory to load vars for managed-node1 41016 1727204220.04087: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204220.04097: Calling all_plugins_play to load vars for managed-node1 41016 1727204220.04101: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204220.04104: Calling groups_plugins_play to load vars for managed-node1 41016 1727204220.06733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204220.09424: done with get_vars() 41016 1727204220.09451: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:169 Tuesday 24 September 2024 14:57:00 -0400 (0:00:00.108) 0:00:43.772 ***** 41016 1727204220.09668: entering _queue_task() for managed-node1/include_tasks 41016 1727204220.10150: worker is 1 (out of 1 available) 41016 1727204220.10161: exiting _queue_task() for managed-node1/include_tasks 41016 1727204220.10174: done queuing things up, now waiting for results queue to drain 41016 1727204220.10178: waiting for pending results... 41016 1727204220.10884: running TaskExecutor() for managed-node1/TASK: Verify network state restored to default 41016 1727204220.10985: in run() - task 028d2410-947f-12d5-0ec4-0000000000bb 41016 1727204220.11008: variable 'ansible_search_path' from source: unknown 41016 1727204220.11048: calling self._execute() 41016 1727204220.11412: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204220.11418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204220.11422: variable 'omit' from source: magic vars 41016 1727204220.12146: variable 'ansible_distribution_major_version' from source: facts 41016 1727204220.12149: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204220.12151: _execute() done 41016 1727204220.12153: dumping result to json 41016 1727204220.12155: done dumping result, returning 41016 1727204220.12157: done running TaskExecutor() for managed-node1/TASK: Verify network state restored to default [028d2410-947f-12d5-0ec4-0000000000bb] 41016 1727204220.12158: sending task result for task 028d2410-947f-12d5-0ec4-0000000000bb 41016 1727204220.12346: done sending task result for task 028d2410-947f-12d5-0ec4-0000000000bb 41016 1727204220.12349: WORKER PROCESS EXITING 41016 1727204220.12377: no more pending results, returning what we have 41016 1727204220.12508: in VariableManager get_vars() 41016 1727204220.12553: Calling all_inventory to load vars for managed-node1 41016 1727204220.12556: Calling groups_inventory to load vars for managed-node1 41016 1727204220.12558: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204220.12568: Calling all_plugins_play to load vars for managed-node1 41016 1727204220.12571: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204220.12574: Calling groups_plugins_play to load vars for managed-node1 41016 1727204220.14559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204220.17234: done with get_vars() 41016 
1727204220.17253: variable 'ansible_search_path' from source: unknown 41016 1727204220.17272: we have included files to process 41016 1727204220.17273: generating all_blocks data 41016 1727204220.17278: done generating all_blocks data 41016 1727204220.17284: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 41016 1727204220.17285: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 41016 1727204220.17288: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 41016 1727204220.17709: done processing included file 41016 1727204220.17714: iterating over new_blocks loaded from include file 41016 1727204220.17716: in VariableManager get_vars() 41016 1727204220.17733: done with get_vars() 41016 1727204220.17735: filtering new block on tags 41016 1727204220.17768: done filtering new block on tags 41016 1727204220.17771: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node1 41016 1727204220.17778: extending task lists for all hosts with included blocks 41016 1727204220.20695: done extending task lists 41016 1727204220.20697: done processing included files 41016 1727204220.20698: results queue empty 41016 1727204220.20699: checking for any_errors_fatal 41016 1727204220.20702: done checking for any_errors_fatal 41016 1727204220.20704: checking for max_fail_percentage 41016 1727204220.20705: done checking for max_fail_percentage 41016 1727204220.20706: checking to see if all hosts have failed and the running result is not ok 41016 1727204220.20706: done checking to see if all hosts have failed 41016 1727204220.20707: getting the remaining hosts for this loop 41016 1727204220.20708: done getting the remaining hosts for this loop 41016 1727204220.20714: getting the next task for host managed-node1 41016 1727204220.20719: done getting next task for host managed-node1 41016 1727204220.20721: ^ task is: TASK: Check routes and DNS 41016 1727204220.20724: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 41016 1727204220.20727: getting variables 41016 1727204220.20728: in VariableManager get_vars() 41016 1727204220.20748: Calling all_inventory to load vars for managed-node1 41016 1727204220.20751: Calling groups_inventory to load vars for managed-node1 41016 1727204220.20753: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204220.20760: Calling all_plugins_play to load vars for managed-node1 41016 1727204220.20762: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204220.20765: Calling groups_plugins_play to load vars for managed-node1 41016 1727204220.22049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204220.23442: done with get_vars() 41016 1727204220.23463: done getting variables 41016 1727204220.23499: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:57:00 -0400 (0:00:00.138) 0:00:43.911 ***** 41016 1727204220.23523: entering _queue_task() for managed-node1/shell 41016 1727204220.23941: worker is 1 (out of 1 available) 41016 1727204220.23953: exiting _queue_task() for managed-node1/shell 41016 1727204220.23965: done queuing things up, now waiting for results queue to drain 41016 1727204220.23967: waiting for pending results... 41016 1727204220.24387: running TaskExecutor() for managed-node1/TASK: Check routes and DNS 41016 1727204220.24687: in run() - task 028d2410-947f-12d5-0ec4-000000000bb6 41016 1727204220.24692: variable 'ansible_search_path' from source: unknown 41016 1727204220.24694: variable 'ansible_search_path' from source: unknown 41016 1727204220.24696: calling self._execute() 41016 1727204220.24742: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204220.24745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204220.24755: variable 'omit' from source: magic vars 41016 1727204220.25553: variable 'ansible_distribution_major_version' from source: facts 41016 1727204220.25564: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204220.25569: variable 'omit' from source: magic vars 41016 1727204220.25619: variable 'omit' from source: magic vars 41016 1727204220.25678: variable 'omit' from source: magic vars 41016 1727204220.25725: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204220.25993: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204220.25996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204220.26000: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204220.26003: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204220.26006: variable 'inventory_hostname' from source: host vars for 'managed-node1' 
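The file processed above, tasks/check_network_dns.yml, supplies the task just queued here (Check routes and DNS) as well as the later Verify DNS and network connectivity task. The including task itself never appears in source form in this log; a minimal sketch of what it plausibly looks like in tests_route_device.yml, reconstructed only from the task name, included path and conditional reported above (the exact YAML in the file may differ):

- name: Verify network state restored to default
  include_tasks: tasks/check_network_dns.yml
  # the log reports this conditional as evaluated True for the include
  when: ansible_distribution_major_version != '6'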
41016 1727204220.26008: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204220.26013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204220.26016: Set connection var ansible_shell_executable to /bin/sh 41016 1727204220.26018: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204220.26020: Set connection var ansible_shell_type to sh 41016 1727204220.26023: Set connection var ansible_timeout to 10 41016 1727204220.26025: Set connection var ansible_pipelining to False 41016 1727204220.26026: Set connection var ansible_connection to ssh 41016 1727204220.26028: variable 'ansible_shell_executable' from source: unknown 41016 1727204220.26030: variable 'ansible_connection' from source: unknown 41016 1727204220.26033: variable 'ansible_module_compression' from source: unknown 41016 1727204220.26035: variable 'ansible_shell_type' from source: unknown 41016 1727204220.26040: variable 'ansible_shell_executable' from source: unknown 41016 1727204220.26042: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204220.26044: variable 'ansible_pipelining' from source: unknown 41016 1727204220.26046: variable 'ansible_timeout' from source: unknown 41016 1727204220.26048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204220.26384: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204220.26401: variable 'omit' from source: magic vars 41016 1727204220.26413: starting attempt loop 41016 1727204220.26417: running the handler 41016 1727204220.26425: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204220.26440: _low_level_execute_command(): starting 41016 1727204220.26448: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204220.28169: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204220.28363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204220.28580: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204220.30255: stdout chunk (state=3): >>>/root <<< 41016 1727204220.30349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204220.30389: stderr chunk (state=3): >>><<< 41016 1727204220.30396: stdout chunk (state=3): >>><<< 41016 1727204220.30418: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204220.30434: _low_level_execute_command(): starting 41016 1727204220.30441: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204220.3041952-43817-216637577657571 `" && echo ansible-tmp-1727204220.3041952-43817-216637577657571="` echo /root/.ansible/tmp/ansible-tmp-1727204220.3041952-43817-216637577657571 `" ) && sleep 0' 41016 1727204220.30861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204220.30894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 41016 1727204220.30897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204220.30908: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204220.30911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204220.30953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204220.30956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204220.31044: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 41016 1727204220.33163: stdout chunk (state=3): >>>ansible-tmp-1727204220.3041952-43817-216637577657571=/root/.ansible/tmp/ansible-tmp-1727204220.3041952-43817-216637577657571 <<< 41016 1727204220.33465: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204220.33469: stderr chunk (state=3): >>><<< 41016 1727204220.33472: stdout chunk (state=3): >>><<< 41016 1727204220.33597: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204220.3041952-43817-216637577657571=/root/.ansible/tmp/ansible-tmp-1727204220.3041952-43817-216637577657571 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204220.33634: variable 'ansible_module_compression' from source: unknown 41016 1727204220.33686: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204220.33725: variable 'ansible_facts' from source: unknown 41016 1727204220.33924: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204220.3041952-43817-216637577657571/AnsiballZ_command.py 41016 1727204220.34163: Sending initial data 41016 1727204220.34167: Sent initial data (156 bytes) 41016 1727204220.34637: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204220.34645: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204220.34655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204220.34669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204220.34771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204220.34783: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204220.34791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204220.34812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204220.34916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204220.36685: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204220.36759: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204220.36896: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpiwpzc2c6 /root/.ansible/tmp/ansible-tmp-1727204220.3041952-43817-216637577657571/AnsiballZ_command.py <<< 41016 1727204220.36899: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204220.3041952-43817-216637577657571/AnsiballZ_command.py" <<< 41016 1727204220.36934: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpiwpzc2c6" to remote "/root/.ansible/tmp/ansible-tmp-1727204220.3041952-43817-216637577657571/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204220.3041952-43817-216637577657571/AnsiballZ_command.py" <<< 41016 1727204220.38050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204220.38054: stdout chunk (state=3): >>><<< 41016 1727204220.38059: stderr chunk (state=3): >>><<< 41016 1727204220.38097: done transferring module to remote 41016 1727204220.38107: _low_level_execute_command(): starting 41016 1727204220.38126: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204220.3041952-43817-216637577657571/ /root/.ansible/tmp/ansible-tmp-1727204220.3041952-43817-216637577657571/AnsiballZ_command.py && sleep 0' 41016 1727204220.38710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204220.38781: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204220.38790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204220.38796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204220.38799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204220.38802: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204220.38804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 
1727204220.38806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204220.38813: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204220.38815: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41016 1727204220.38887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204220.38934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204220.39019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204220.41048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204220.41061: stderr chunk (state=3): >>><<< 41016 1727204220.41069: stdout chunk (state=3): >>><<< 41016 1727204220.41108: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204220.41117: _low_level_execute_command(): starting 41016 1727204220.41127: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204220.3041952-43817-216637577657571/AnsiballZ_command.py && sleep 0' 41016 1727204220.41756: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204220.41770: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204220.41786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204220.41802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204220.41818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204220.41835: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204220.41848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
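The multi-step pattern visible around this point (create a remote temp dir, sftp AnsiballZ_command.py across, chmod it, run it with /usr/bin/python3.12, then remove the temp dir) is the non-pipelined execution path; the connection setup above explicitly records "Set connection var ansible_pipelining to False". A hedged sketch of enabling pipelining for these hosts via group_vars, which would feed the module over the existing SSH session instead of writing it to a remote temp file (an assumption about tuning the test inventory, not something this run did):

# group_vars/all.yml (hypothetical; this run had ansible_pipelining set to False)
ansible_pipelining: true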
41016 1727204220.41890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204220.41957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204220.41974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204220.41996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204220.42113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204220.59418: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:dd:89:9b:e5 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.47/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2823sec preferred_lft 2823sec\n inet6 fe80::8ff:ddff:fe89:9be5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:57:00.582863", "end": "2024-09-24 14:57:00.592311", "delta": "0:00:00.009448", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204220.61249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204220.61261: stdout chunk (state=3): >>><<< 41016 1727204220.61383: stderr chunk (state=3): >>><<< 41016 1727204220.61387: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:dd:89:9b:e5 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.47/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2823sec preferred_lft 2823sec\n inet6 fe80::8ff:ddff:fe89:9be5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:57:00.582863", "end": "2024-09-24 14:57:00.592311", "delta": "0:00:00.009448", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
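The command echoed in the raw result above is hard to read through the JSON escaping; a sketch of the task it corresponds to at check_network_dns.yml:6, with the script body taken verbatim from the _raw_params in this log (the surrounding YAML layout is an assumption):

- name: Check routes and DNS
  ansible.builtin.shell: |
    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
      cat /etc/resolv.conf
    else
      echo NO /etc/resolv.conf
      ls -alrtF /etc/resolv.* || :
    fi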
41016 1727204220.61397: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204220.3041952-43817-216637577657571/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204220.61399: _low_level_execute_command(): starting 41016 1727204220.61402: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204220.3041952-43817-216637577657571/ > /dev/null 2>&1 && sleep 0' 41016 1727204220.62004: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204220.62069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204220.62125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204220.62141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204220.62187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204220.62274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204220.64248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204220.64266: stderr chunk (state=3): >>><<< 41016 1727204220.64269: stdout chunk (state=3): >>><<< 41016 1727204220.64305: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204220.64309: handler run complete 41016 1727204220.64333: Evaluated conditional (False): False 41016 1727204220.64336: attempt loop complete, returning result 41016 1727204220.64338: _execute() done 41016 1727204220.64344: dumping result to json 41016 1727204220.64350: done dumping result, returning 41016 1727204220.64358: done running TaskExecutor() for managed-node1/TASK: Check routes and DNS [028d2410-947f-12d5-0ec4-000000000bb6] 41016 1727204220.64361: sending task result for task 028d2410-947f-12d5-0ec4-000000000bb6 41016 1727204220.64484: done sending task result for task 028d2410-947f-12d5-0ec4-000000000bb6 41016 1727204220.64487: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009448", "end": "2024-09-24 14:57:00.592311", "rc": 0, "start": "2024-09-24 14:57:00.582863" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:dd:89:9b:e5 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.14.47/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 2823sec preferred_lft 2823sec inet6 fe80::8ff:ddff:fe89:9be5/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 41016 1727204220.64566: no more pending results, returning what we have 41016 1727204220.64570: results queue empty 41016 1727204220.64571: checking for any_errors_fatal 41016 1727204220.64573: done checking for any_errors_fatal 41016 1727204220.64573: checking for max_fail_percentage 41016 1727204220.64577: done checking for max_fail_percentage 41016 1727204220.64578: checking to see if all hosts have failed and the running result is not ok 41016 1727204220.64579: done checking to see if all hosts have failed 41016 1727204220.64579: getting the remaining hosts for this loop 41016 1727204220.64581: done getting the remaining hosts for this loop 41016 1727204220.64584: getting the next task for host managed-node1 41016 1727204220.64591: done getting next task for host managed-node1 41016 1727204220.64593: ^ task is: TASK: Verify DNS and 
network connectivity 41016 1727204220.64597: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 41016 1727204220.64606: getting variables 41016 1727204220.64608: in VariableManager get_vars() 41016 1727204220.64649: Calling all_inventory to load vars for managed-node1 41016 1727204220.64652: Calling groups_inventory to load vars for managed-node1 41016 1727204220.64654: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204220.64664: Calling all_plugins_play to load vars for managed-node1 41016 1727204220.64667: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204220.64669: Calling groups_plugins_play to load vars for managed-node1 41016 1727204220.66202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204220.67782: done with get_vars() 41016 1727204220.67809: done getting variables 41016 1727204220.67873: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:57:00 -0400 (0:00:00.443) 0:00:44.355 ***** 41016 1727204220.67909: entering _queue_task() for managed-node1/shell 41016 1727204220.68373: worker is 1 (out of 1 available) 41016 1727204220.68387: exiting _queue_task() for managed-node1/shell 41016 1727204220.68399: done queuing things up, now waiting for results queue to drain 41016 1727204220.68400: waiting for pending results... 
41016 1727204220.68650: running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity 41016 1727204220.68732: in run() - task 028d2410-947f-12d5-0ec4-000000000bb7 41016 1727204220.68742: variable 'ansible_search_path' from source: unknown 41016 1727204220.68746: variable 'ansible_search_path' from source: unknown 41016 1727204220.68774: calling self._execute() 41016 1727204220.68855: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204220.68858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204220.68868: variable 'omit' from source: magic vars 41016 1727204220.69212: variable 'ansible_distribution_major_version' from source: facts 41016 1727204220.69217: Evaluated conditional (ansible_distribution_major_version != '6'): True 41016 1727204220.69359: variable 'ansible_facts' from source: unknown 41016 1727204220.70132: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 41016 1727204220.70136: variable 'omit' from source: magic vars 41016 1727204220.70139: variable 'omit' from source: magic vars 41016 1727204220.70154: variable 'omit' from source: magic vars 41016 1727204220.70191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 41016 1727204220.70222: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 41016 1727204220.70279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 41016 1727204220.70326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204220.70364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 41016 1727204220.70484: variable 'inventory_hostname' from source: host vars for 'managed-node1' 41016 1727204220.70487: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204220.70490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204220.70986: Set connection var ansible_shell_executable to /bin/sh 41016 1727204220.70988: Set connection var ansible_module_compression to ZIP_DEFLATED 41016 1727204220.70990: Set connection var ansible_shell_type to sh 41016 1727204220.70992: Set connection var ansible_timeout to 10 41016 1727204220.70994: Set connection var ansible_pipelining to False 41016 1727204220.70996: Set connection var ansible_connection to ssh 41016 1727204220.70998: variable 'ansible_shell_executable' from source: unknown 41016 1727204220.71000: variable 'ansible_connection' from source: unknown 41016 1727204220.71002: variable 'ansible_module_compression' from source: unknown 41016 1727204220.71003: variable 'ansible_shell_type' from source: unknown 41016 1727204220.71005: variable 'ansible_shell_executable' from source: unknown 41016 1727204220.71007: variable 'ansible_host' from source: host vars for 'managed-node1' 41016 1727204220.71008: variable 'ansible_pipelining' from source: unknown 41016 1727204220.71013: variable 'ansible_timeout' from source: unknown 41016 1727204220.71016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 41016 1727204220.71019: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204220.71021: variable 'omit' from source: magic vars 41016 1727204220.71023: starting attempt loop 41016 1727204220.71025: running the handler 41016 1727204220.71082: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 41016 1727204220.71085: _low_level_execute_command(): starting 41016 1727204220.71088: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 41016 1727204220.71736: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204220.71748: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204220.71781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204220.71786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204220.71790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204220.71800: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204220.71802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204220.71881: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204220.71884: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204220.71886: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 41016 1727204220.71888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204220.71890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204220.71892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204220.71894: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204220.71895: stderr chunk (state=3): >>>debug2: match found <<< 41016 1727204220.71897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204220.71944: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204220.71955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204220.71973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204220.72087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204220.73892: stdout chunk (state=3): >>>/root <<< 41016 1727204220.74080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204220.74104: stdout chunk (state=3): >>><<< 41016 1727204220.74107: stderr chunk (state=3): >>><<< 41016 1727204220.74132: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204220.74289: _low_level_execute_command(): starting 41016 1727204220.74294: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204220.7415855-43867-243832878555819 `" && echo ansible-tmp-1727204220.7415855-43867-243832878555819="` echo /root/.ansible/tmp/ansible-tmp-1727204220.7415855-43867-243832878555819 `" ) && sleep 0' 41016 1727204220.74972: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 41016 1727204220.74989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204220.75046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204220.75103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204220.75127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204220.75251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204220.77363: stdout chunk (state=3): >>>ansible-tmp-1727204220.7415855-43867-243832878555819=/root/.ansible/tmp/ansible-tmp-1727204220.7415855-43867-243832878555819 <<< 41016 1727204220.77481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204220.77560: stderr chunk (state=3): >>><<< 41016 1727204220.77563: stdout chunk (state=3): >>><<< 41016 1727204220.77629: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204220.7415855-43867-243832878555819=/root/.ansible/tmp/ansible-tmp-1727204220.7415855-43867-243832878555819 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204220.77658: variable 'ansible_module_compression' from source: unknown 41016 1727204220.77777: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-410168h8uvyln/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 41016 1727204220.77780: variable 'ansible_facts' from source: unknown 41016 1727204220.77881: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204220.7415855-43867-243832878555819/AnsiballZ_command.py 41016 1727204220.78364: Sending initial data 41016 1727204220.78367: Sent initial data (156 bytes) 41016 1727204220.79693: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204220.79778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204220.79882: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204220.80052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204220.80177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204220.81926: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 
1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 41016 1727204220.82044: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 41016 1727204220.82199: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-410168h8uvyln/tmpftmkk80r /root/.ansible/tmp/ansible-tmp-1727204220.7415855-43867-243832878555819/AnsiballZ_command.py <<< 41016 1727204220.82203: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204220.7415855-43867-243832878555819/AnsiballZ_command.py" <<< 41016 1727204220.82398: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-410168h8uvyln/tmpftmkk80r" to remote "/root/.ansible/tmp/ansible-tmp-1727204220.7415855-43867-243832878555819/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204220.7415855-43867-243832878555819/AnsiballZ_command.py" <<< 41016 1727204220.84056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204220.84061: stdout chunk (state=3): >>><<< 41016 1727204220.84068: stderr chunk (state=3): >>><<< 41016 1727204220.84103: done transferring module to remote 41016 1727204220.84116: _low_level_execute_command(): starting 41016 1727204220.84119: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204220.7415855-43867-243832878555819/ /root/.ansible/tmp/ansible-tmp-1727204220.7415855-43867-243832878555819/AnsiballZ_command.py && sleep 0' 41016 1727204220.85617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204220.86040: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204220.86049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204220.86052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204220.86054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204220.86056: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204220.86058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204220.86060: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204220.86062: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 41016 1727204220.86283: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204220.86288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204220.86512: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204220.88596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204220.88600: stdout chunk (state=3): >>><<< 41016 1727204220.88606: stderr chunk (state=3): >>><<< 41016 1727204220.88623: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204220.88627: _low_level_execute_command(): starting 41016 1727204220.88633: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204220.7415855-43867-243832878555819/AnsiballZ_command.py && sleep 0' 41016 1727204220.89396: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 41016 1727204220.89496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 41016 1727204220.89509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204220.89522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 41016 1727204220.89535: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 41016 1727204220.89547: stderr chunk (state=3): >>>debug2: match not found <<< 41016 1727204220.89557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204220.89710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 41016 1727204220.89715: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204220.89718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 41016 1727204220.89720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204220.89807: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 41016 1727204221.15219: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6773 0 --:--:-- --:--:-- --:--:-- 6931\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 14737 0 --:--:-- --:--:-- --:--:-- 15315", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:57:01.059539", "end": "2024-09-24 14:57:01.149586", "delta": "0:00:00.090047", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 41016 1727204221.17086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 41016 1727204221.17096: stdout chunk (state=3): >>><<< 41016 1727204221.17098: stderr chunk (state=3): >>><<< 41016 1727204221.17101: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6773 0 --:--:-- --:--:-- --:--:-- 6931\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 14737 0 --:--:-- --:--:-- --:--:-- 15315", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:57:01.059539", "end": "2024-09-24 14:57:01.149586", "delta": "0:00:00.090047", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 41016 1727204221.17104: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204220.7415855-43867-243832878555819/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 41016 1727204221.17116: _low_level_execute_command(): starting 41016 1727204221.17126: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204220.7415855-43867-243832878555819/ > /dev/null 2>&1 && sleep 0' 41016 1727204221.17811: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204221.17826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 41016 1727204221.17836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 41016 1727204221.17884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 41016 1727204221.17897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 41016 1727204221.17990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 41016 1727204221.19997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 41016 1727204221.20028: stderr chunk (state=3): >>><<< 41016 1727204221.20032: stdout chunk (state=3): >>><<< 41016 1727204221.20050: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 41016 1727204221.20059: handler run complete 41016 1727204221.20075: Evaluated conditional (False): False 41016 1727204221.20085: attempt loop complete, returning result 41016 1727204221.20088: _execute() done 41016 1727204221.20090: dumping result to json 41016 1727204221.20096: done dumping result, returning 41016 1727204221.20103: done running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity [028d2410-947f-12d5-0ec4-000000000bb7] 41016 1727204221.20107: sending task result for task 028d2410-947f-12d5-0ec4-000000000bb7 41016 1727204221.20213: done sending task result for task 028d2410-947f-12d5-0ec4-000000000bb7 41016 1727204221.20217: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.090047", "end": "2024-09-24 14:57:01.149586", "rc": 0, "start": "2024-09-24 14:57:01.059539" } STDOUT: CHECK DNS AND CONNECTIVITY 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 6773 0 --:--:-- --:--:-- --:--:-- 6931 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 14737 0 --:--:-- --:--:-- --:--:-- 15315 41016 1727204221.20286: no more pending results, returning what we have 41016 1727204221.20290: results queue empty 41016 1727204221.20291: checking for any_errors_fatal 41016 1727204221.20300: done checking for any_errors_fatal 41016 1727204221.20301: checking for max_fail_percentage 41016 1727204221.20303: done checking for 
max_fail_percentage 41016 1727204221.20304: checking to see if all hosts have failed and the running result is not ok 41016 1727204221.20305: done checking to see if all hosts have failed 41016 1727204221.20306: getting the remaining hosts for this loop 41016 1727204221.20308: done getting the remaining hosts for this loop 41016 1727204221.20311: getting the next task for host managed-node1 41016 1727204221.20321: done getting next task for host managed-node1 41016 1727204221.20323: ^ task is: TASK: meta (flush_handlers) 41016 1727204221.20325: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 41016 1727204221.20330: getting variables 41016 1727204221.20332: in VariableManager get_vars() 41016 1727204221.20373: Calling all_inventory to load vars for managed-node1 41016 1727204221.20384: Calling groups_inventory to load vars for managed-node1 41016 1727204221.20388: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204221.20400: Calling all_plugins_play to load vars for managed-node1 41016 1727204221.20403: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204221.20405: Calling groups_plugins_play to load vars for managed-node1 41016 1727204221.21223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204221.22113: done with get_vars() 41016 1727204221.22137: done getting variables 41016 1727204221.22192: in VariableManager get_vars() 41016 1727204221.22204: Calling all_inventory to load vars for managed-node1 41016 1727204221.22206: Calling groups_inventory to load vars for managed-node1 41016 1727204221.22207: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204221.22211: Calling all_plugins_play to load vars for managed-node1 41016 1727204221.22213: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204221.22214: Calling groups_plugins_play to load vars for managed-node1 41016 1727204221.22960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204221.23834: done with get_vars() 41016 1727204221.23864: done queuing things up, now waiting for results queue to drain 41016 1727204221.23866: results queue empty 41016 1727204221.23866: checking for any_errors_fatal 41016 1727204221.23869: done checking for any_errors_fatal 41016 1727204221.23870: checking for max_fail_percentage 41016 1727204221.23871: done checking for max_fail_percentage 41016 1727204221.23871: checking to see if all hosts have failed and the running result is not ok 41016 1727204221.23872: done checking to see if all hosts have failed 41016 1727204221.23872: getting the remaining hosts for this loop 41016 1727204221.23873: done getting the remaining hosts for this loop 41016 1727204221.23877: getting the next task for host managed-node1 41016 1727204221.23880: done getting next task for host managed-node1 41016 1727204221.23881: ^ task is: TASK: meta (flush_handlers) 41016 1727204221.23882: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 41016 1727204221.23885: getting variables 41016 1727204221.23885: in VariableManager get_vars() 41016 1727204221.23896: Calling all_inventory to load vars for managed-node1 41016 1727204221.23897: Calling groups_inventory to load vars for managed-node1 41016 1727204221.23899: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204221.23905: Calling all_plugins_play to load vars for managed-node1 41016 1727204221.23906: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204221.23908: Calling groups_plugins_play to load vars for managed-node1 41016 1727204221.24582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204221.25482: done with get_vars() 41016 1727204221.25499: done getting variables 41016 1727204221.25538: in VariableManager get_vars() 41016 1727204221.25549: Calling all_inventory to load vars for managed-node1 41016 1727204221.25551: Calling groups_inventory to load vars for managed-node1 41016 1727204221.25552: Calling all_plugins_inventory to load vars for managed-node1 41016 1727204221.25556: Calling all_plugins_play to load vars for managed-node1 41016 1727204221.25561: Calling groups_plugins_inventory to load vars for managed-node1 41016 1727204221.25563: Calling groups_plugins_play to load vars for managed-node1 41016 1727204221.26215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 41016 1727204221.27079: done with get_vars() 41016 1727204221.27107: done queuing things up, now waiting for results queue to drain 41016 1727204221.27109: results queue empty 41016 1727204221.27109: checking for any_errors_fatal 41016 1727204221.27111: done checking for any_errors_fatal 41016 1727204221.27112: checking for max_fail_percentage 41016 1727204221.27113: done checking for max_fail_percentage 41016 1727204221.27113: checking to see if all hosts have failed and the running result is not ok 41016 1727204221.27114: done checking to see if all hosts have failed 41016 1727204221.27114: getting the remaining hosts for this loop 41016 1727204221.27115: done getting the remaining hosts for this loop 41016 1727204221.27117: getting the next task for host managed-node1 41016 1727204221.27120: done getting next task for host managed-node1 41016 1727204221.27120: ^ task is: None 41016 1727204221.27121: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 41016 1727204221.27122: done queuing things up, now waiting for results queue to drain 41016 1727204221.27123: results queue empty 41016 1727204221.27123: checking for any_errors_fatal 41016 1727204221.27124: done checking for any_errors_fatal 41016 1727204221.27124: checking for max_fail_percentage 41016 1727204221.27125: done checking for max_fail_percentage 41016 1727204221.27125: checking to see if all hosts have failed and the running result is not ok 41016 1727204221.27126: done checking to see if all hosts have failed 41016 1727204221.27127: getting the next task for host managed-node1 41016 1727204221.27129: done getting next task for host managed-node1 41016 1727204221.27129: ^ task is: None 41016 1727204221.27130: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node1              : ok=108  changed=3   unreachable=0   failed=0   skipped=87   rescued=0   ignored=2

Tuesday 24 September 2024  14:57:01 -0400 (0:00:00.592)       0:00:44.948 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.11s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.10s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.05s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.52s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:6
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.39s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 1.37s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Create veth interface ethtest0 ------------------------------------------ 1.30s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Gathering Facts --------------------------------------------------------- 1.23s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:3
Create veth interface ethtest1 ------------------------------------------ 1.20s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.08s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Install iproute --------------------------------------------------------- 1.02s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which packages are installed --- 0.96s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gather the minimum subset of ansible_facts required by the network role test --- 0.91s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.82s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Check if system is ostree ----------------------------------------------- 0.81s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Install iproute --------------------------------------------------------- 0.77s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.76s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.76s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.68s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Remove test interface if necessary -------------------------------------- 0.67s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3
41016 1727204221.27235: RUNNING CLEANUP
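
For reference, the per-task remote execution captured in the trace above reduces to the shell sketch below. The temporary directory name is taken verbatim from the log, and the transfer step paraphrases the "sftp> put" line; this is an illustration of what the trace records, not a sequence intended to be rerun as-is.

    # Temporary per-task directory Ansible created on the managed node (value from the log above)
    tmpdir=/root/.ansible/tmp/ansible-tmp-1727204220.7415855-43867-243832878555819

    # 1. The AnsiballZ-wrapped command module is copied over the existing SSH ControlMaster
    #    connection ("sftp> put <local tmpfile> $tmpdir/AnsiballZ_command.py" in the trace).
    # 2. The wrapper and its directory are made executable:
    /bin/sh -c "chmod u+x $tmpdir/ $tmpdir/AnsiballZ_command.py && sleep 0"
    # 3. The wrapper is run with the interpreter discovered for this host:
    /bin/sh -c "/usr/bin/python3.12 $tmpdir/AnsiballZ_command.py && sleep 0"
    # 4. The temporary directory is removed once the JSON result has been read back:
    /bin/sh -c "rm -f -r $tmpdir/ > /dev/null 2>&1 && sleep 0"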
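
The "Verify DNS and network connectivity" task whose result appears above ran the shell snippet below. It is reproduced from the _raw_params value in the module invocation, reformatted and commented here for readability only.

    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
        # Resolve the mirror name; fail the task if DNS lookup returns nothing.
        if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            exit 1
        fi
        # Fetch the front page over HTTPS; fail the task if the host is unreachable.
        if ! curl -o /dev/null https://"$host"; then
            echo FAILED to contact host "$host"
            exit 1
        fi
    done

Because curl is invoked without -s, its progress meter is written to stderr, which is why the STDERR block of the otherwise successful (rc=0) result contains the two transfer tables.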