49915 1727204292.41914: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-bGV executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 49915 1727204292.43268: Added group all to inventory 49915 1727204292.43270: Added group ungrouped to inventory 49915 1727204292.43277: Group all now contains ungrouped 49915 1727204292.43281: Examining possible inventory source: /tmp/network-zt6/inventory-rSl.yml 49915 1727204292.78716: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 49915 1727204292.78956: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 49915 1727204292.78981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 49915 1727204292.79044: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 49915 1727204292.79240: Loaded config def from plugin (inventory/script) 49915 1727204292.79242: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 49915 1727204292.79399: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 49915 1727204292.79584: Loaded config def from plugin (inventory/yaml) 49915 1727204292.79587: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 49915 1727204292.79709: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 49915 1727204292.80743: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 49915 1727204292.80746: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 49915 1727204292.80749: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 49915 1727204292.80755: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 49915 1727204292.80760: Loading data from /tmp/network-zt6/inventory-rSl.yml 49915 1727204292.80889: /tmp/network-zt6/inventory-rSl.yml was not parsable by auto 49915 1727204292.81011: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 49915 1727204292.81087: Loading data from /tmp/network-zt6/inventory-rSl.yml 49915 1727204292.81329: group all already in inventory 49915 1727204292.81337: set inventory_file for managed-node1 49915 1727204292.81341: set inventory_dir for managed-node1 49915 1727204292.81342: Added host managed-node1 to inventory 49915 1727204292.81344: Added host managed-node1 to group all 49915 1727204292.81345: set ansible_host for managed-node1 49915 1727204292.81346: set ansible_ssh_extra_args for managed-node1 49915 1727204292.81350: set inventory_file for managed-node2 49915 1727204292.81352: set inventory_dir for managed-node2 49915 1727204292.81353: Added host managed-node2 to inventory 49915 1727204292.81355: Added host managed-node2 to group 
all 49915 1727204292.81356: set ansible_host for managed-node2 49915 1727204292.81357: set ansible_ssh_extra_args for managed-node2 49915 1727204292.81359: set inventory_file for managed-node3 49915 1727204292.81362: set inventory_dir for managed-node3 49915 1727204292.81362: Added host managed-node3 to inventory 49915 1727204292.81364: Added host managed-node3 to group all 49915 1727204292.81365: set ansible_host for managed-node3 49915 1727204292.81480: set ansible_ssh_extra_args for managed-node3 49915 1727204292.81484: Reconcile groups and hosts in inventory. 49915 1727204292.81488: Group ungrouped now contains managed-node1 49915 1727204292.81490: Group ungrouped now contains managed-node2 49915 1727204292.81492: Group ungrouped now contains managed-node3 49915 1727204292.81575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 49915 1727204292.81929: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 49915 1727204292.82027: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 49915 1727204292.82054: Loaded config def from plugin (vars/host_group_vars) 49915 1727204292.82056: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 49915 1727204292.82064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 49915 1727204292.82072: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 49915 1727204292.82120: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 49915 1727204292.83569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204292.83669: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 49915 1727204292.84118: Loaded config def from plugin (connection/local) 49915 1727204292.84122: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 49915 1727204292.85734: Loaded config def from plugin (connection/paramiko_ssh) 49915 1727204292.85738: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 49915 1727204292.87684: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 49915 1727204292.87797: Loaded config def from plugin (connection/psrp) 49915 1727204292.87801: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 49915 1727204292.89470: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 49915 1727204292.89510: Loaded config def from plugin (connection/ssh) 49915 1727204292.89515: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 49915 1727204292.94344: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 49915 1727204292.94387: Loaded config def from plugin (connection/winrm) 49915 1727204292.94391: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 49915 1727204292.94504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 49915 1727204292.94636: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 49915 1727204292.94706: Loaded config def from plugin (shell/cmd) 49915 1727204292.94709: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 49915 1727204292.94853: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 49915 1727204292.94924: Loaded config def from plugin (shell/powershell) 49915 1727204292.94926: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 49915 1727204292.95127: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 49915 1727204292.95572: Loaded config def from plugin (shell/sh) 49915 1727204292.95574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 49915 1727204292.95655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 49915 1727204292.95958: Loaded config def from plugin (become/runas) 49915 1727204292.95960: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 49915 1727204292.96417: Loaded config def from plugin (become/su) 49915 1727204292.96420: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 49915 1727204292.96728: Loaded config def from plugin (become/sudo) 49915 1727204292.96730: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 49915 1727204292.96765: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml 49915 1727204292.97534: in VariableManager get_vars() 49915 1727204292.97673: done with get_vars() 49915 1727204292.97927: trying /usr/local/lib/python3.12/site-packages/ansible/modules 49915 1727204293.04821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 49915 1727204293.05054: in VariableManager get_vars() 49915 1727204293.05059: done with get_vars() 49915 1727204293.05062: variable 'playbook_dir' from source: magic vars 49915 1727204293.05063: variable 'ansible_playbook_python' from source: magic vars 49915 1727204293.05064: variable 'ansible_config_file' from 
source: magic vars 49915 1727204293.05065: variable 'groups' from source: magic vars 49915 1727204293.05065: variable 'omit' from source: magic vars 49915 1727204293.05066: variable 'ansible_version' from source: magic vars 49915 1727204293.05067: variable 'ansible_check_mode' from source: magic vars 49915 1727204293.05067: variable 'ansible_diff_mode' from source: magic vars 49915 1727204293.05068: variable 'ansible_forks' from source: magic vars 49915 1727204293.05069: variable 'ansible_inventory_sources' from source: magic vars 49915 1727204293.05069: variable 'ansible_skip_tags' from source: magic vars 49915 1727204293.05070: variable 'ansible_limit' from source: magic vars 49915 1727204293.05071: variable 'ansible_run_tags' from source: magic vars 49915 1727204293.05071: variable 'ansible_verbosity' from source: magic vars 49915 1727204293.05110: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml 49915 1727204293.05973: in VariableManager get_vars() 49915 1727204293.06095: done with get_vars() 49915 1727204293.06143: in VariableManager get_vars() 49915 1727204293.06157: done with get_vars() 49915 1727204293.06244: in VariableManager get_vars() 49915 1727204293.06266: done with get_vars() 49915 1727204293.06681: in VariableManager get_vars() 49915 1727204293.06695: done with get_vars() 49915 1727204293.06700: variable 'omit' from source: magic vars 49915 1727204293.06723: variable 'omit' from source: magic vars 49915 1727204293.06758: in VariableManager get_vars() 49915 1727204293.06856: done with get_vars() 49915 1727204293.06911: in VariableManager get_vars() 49915 1727204293.06970: done with get_vars() 49915 1727204293.07132: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 49915 1727204293.07572: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 49915 1727204293.07834: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 49915 1727204293.09408: in VariableManager get_vars() 49915 1727204293.09432: done with get_vars() 49915 1727204293.10471: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 49915 1727204293.10815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 49915 1727204293.14414: in VariableManager get_vars() 49915 1727204293.14435: done with get_vars() 49915 1727204293.14594: in VariableManager get_vars() 49915 1727204293.14632: done with get_vars() 49915 1727204293.14862: in VariableManager get_vars() 49915 1727204293.14882: done with get_vars() 49915 1727204293.14983: variable 'omit' from source: magic vars 49915 1727204293.14999: variable 'omit' from source: magic vars 49915 1727204293.15035: in VariableManager get_vars() 49915 1727204293.15049: done with get_vars() 49915 1727204293.15070: in VariableManager get_vars() 49915 1727204293.15229: done with get_vars() 49915 1727204293.15259: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 49915 1727204293.15551: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 49915 1727204293.15635: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 49915 
1727204293.16801: in VariableManager get_vars() 49915 1727204293.16826: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 49915 1727204293.20912: in VariableManager get_vars() 49915 1727204293.20937: done with get_vars() 49915 1727204293.21180: in VariableManager get_vars() 49915 1727204293.21199: done with get_vars() 49915 1727204293.21252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 49915 1727204293.21267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 49915 1727204293.25664: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 49915 1727204293.26061: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 49915 1727204293.26065: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 49915 1727204293.26098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 49915 1727204293.26121: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 49915 1727204293.26461: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 49915 1727204293.26776: Loaded config def from plugin (callback/default) 49915 1727204293.26779: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 49915 1727204293.29018: Loaded config def from plugin (callback/junit) 49915 1727204293.29021: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 49915 1727204293.29066: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 49915 1727204293.29232: Loaded config def from plugin (callback/minimal) 49915 1727204293.29235: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 49915 1727204293.29273: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 49915 1727204293.29333: Loaded config def from plugin (callback/tree) 49915 1727204293.29336: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 49915 1727204293.29650: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 49915 1727204293.29653: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_vlan_mtu_nm.yml ************************************************ 2 plays in /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml 49915 1727204293.29683: in VariableManager get_vars() 49915 1727204293.29698: done with get_vars() 49915 1727204293.29704: in VariableManager get_vars() 49915 1727204293.29714: done with get_vars() 49915 1727204293.29718: variable 'omit' from source: magic vars 49915 1727204293.29754: in VariableManager get_vars() 49915 1727204293.29768: done with get_vars() 49915 1727204293.29993: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_vlan_mtu.yml' with nm as provider] ********* 49915 1727204293.30949: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 49915 1727204293.31226: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 49915 1727204293.31484: getting the remaining hosts for this loop 49915 1727204293.31486: done getting the remaining hosts for this loop 49915 1727204293.31489: getting the next task for host managed-node2 49915 1727204293.31493: done getting next task for host managed-node2 49915 1727204293.31495: ^ task is: TASK: Gathering Facts 49915 1727204293.31497: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204293.31499: getting variables 49915 1727204293.31500: in VariableManager get_vars() 49915 1727204293.31512: Calling all_inventory to load vars for managed-node2 49915 1727204293.31515: Calling groups_inventory to load vars for managed-node2 49915 1727204293.31517: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204293.31529: Calling all_plugins_play to load vars for managed-node2 49915 1727204293.31540: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204293.31544: Calling groups_plugins_play to load vars for managed-node2 49915 1727204293.31579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204293.31632: done with get_vars() 49915 1727204293.31639: done getting variables 49915 1727204293.31906: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:6 Tuesday 24 September 2024 14:58:13 -0400 (0:00:00.025) 0:00:00.025 ***** 49915 1727204293.31929: entering _queue_task() for managed-node2/gather_facts 49915 1727204293.31930: Creating lock for gather_facts 49915 1727204293.32689: worker is 1 (out of 1 available) 49915 1727204293.32699: exiting _queue_task() for managed-node2/gather_facts 49915 1727204293.32711: done queuing things up, now waiting for results queue to drain 49915 1727204293.32713: waiting for pending results... 
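(For orientation: the pid/timestamp-prefixed lines in this trace come from Ansible's debug logging, the debug1:/debug2: lines are OpenSSH client verbosity, and the PLAYBOOK/PLAY/TASK banners are the normal stdout callback output. A command of roughly the following shape produces this kind of trace; the environment variables and flags are assumptions, the paths are the ones shown in the log header above:

    ANSIBLE_DEBUG=1 ANSIBLE_COLLECTIONS_PATH=/tmp/collections-bGV \
    ansible-playbook -vvvv \
        -i /tmp/network-zt6/inventory-rSl.yml \
        /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml
)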
49915 1727204293.33215: running TaskExecutor() for managed-node2/TASK: Gathering Facts 49915 1727204293.33380: in run() - task 028d2410-947f-dcd7-b5af-0000000000af 49915 1727204293.33532: variable 'ansible_search_path' from source: unknown 49915 1727204293.33567: calling self._execute() 49915 1727204293.33860: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204293.33883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204293.33886: variable 'omit' from source: magic vars 49915 1727204293.34206: variable 'omit' from source: magic vars 49915 1727204293.34462: variable 'omit' from source: magic vars 49915 1727204293.34466: variable 'omit' from source: magic vars 49915 1727204293.34474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204293.34702: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204293.34841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204293.34858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204293.34869: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204293.34900: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204293.34904: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204293.34907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204293.35488: Set connection var ansible_connection to ssh 49915 1727204293.35491: Set connection var ansible_shell_type to sh 49915 1727204293.35499: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204293.35539: Set connection var ansible_shell_executable to /bin/sh 49915 1727204293.35542: Set connection var ansible_timeout to 10 49915 1727204293.35545: Set connection var ansible_pipelining to False 49915 1727204293.35553: variable 'ansible_shell_executable' from source: unknown 49915 1727204293.35555: variable 'ansible_connection' from source: unknown 49915 1727204293.35558: variable 'ansible_module_compression' from source: unknown 49915 1727204293.35560: variable 'ansible_shell_type' from source: unknown 49915 1727204293.35563: variable 'ansible_shell_executable' from source: unknown 49915 1727204293.35565: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204293.35567: variable 'ansible_pipelining' from source: unknown 49915 1727204293.35569: variable 'ansible_timeout' from source: unknown 49915 1727204293.35571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204293.36447: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204293.36451: variable 'omit' from source: magic vars 49915 1727204293.36453: starting attempt loop 49915 1727204293.36456: running the handler 49915 1727204293.36458: variable 'ansible_facts' from source: unknown 49915 1727204293.36533: _low_level_execute_command(): starting 49915 1727204293.36537: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204293.38149: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204293.38198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204293.38201: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49915 1727204293.38205: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 49915 1727204293.38207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204293.38473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204293.38502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204293.40297: stdout chunk (state=3): >>>/root <<< 49915 1727204293.40324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204293.40544: stderr chunk (state=3): >>><<< 49915 1727204293.40548: stdout chunk (state=3): >>><<< 49915 1727204293.40550: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204293.40553: _low_level_execute_command(): starting 49915 1727204293.40555: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204293.404674-50017-136991992923155 `" && echo ansible-tmp-1727204293.404674-50017-136991992923155="` echo /root/.ansible/tmp/ansible-tmp-1727204293.404674-50017-136991992923155 `" ) && sleep 0' 
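(The two wrapper commands executed above are repeated here with comments for readability; the command text is taken verbatim from the log, only the comments are added, and the timestamped directory name is specific to this run:

    # Step 1: discover the remote user's home directory so remote_tmp can be expanded;
    # the trailing 'sleep 0' is a harmless no-op Ansible appends to every wrapped command.
    /bin/sh -c 'echo ~ && sleep 0'

    # Step 2: create a private (umask 77) per-task temp directory under ~/.ansible/tmp and
    # echo name=path back to the controller, which parses it to learn where to upload the module.
    /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204293.404674-50017-136991992923155 `" && echo ansible-tmp-1727204293.404674-50017-136991992923155="` echo /root/.ansible/tmp/ansible-tmp-1727204293.404674-50017-136991992923155 `" ) && sleep 0'
)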
49915 1727204293.41928: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204293.41932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204293.41934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204293.41936: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204293.41946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204293.42209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204293.44103: stdout chunk (state=3): >>>ansible-tmp-1727204293.404674-50017-136991992923155=/root/.ansible/tmp/ansible-tmp-1727204293.404674-50017-136991992923155 <<< 49915 1727204293.44396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204293.44400: stdout chunk (state=3): >>><<< 49915 1727204293.44402: stderr chunk (state=3): >>><<< 49915 1727204293.44405: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204293.404674-50017-136991992923155=/root/.ansible/tmp/ansible-tmp-1727204293.404674-50017-136991992923155 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204293.44407: variable 'ansible_module_compression' from source: unknown 49915 1727204293.44409: ANSIBALLZ: Using generic lock for ansible.legacy.setup 49915 1727204293.44411: ANSIBALLZ: Acquiring lock 49915 1727204293.44413: ANSIBALLZ: Lock acquired: 140698012046288 49915 1727204293.44415: ANSIBALLZ: Creating module 49915 
1727204293.89749: ANSIBALLZ: Writing module into payload 49915 1727204293.89836: ANSIBALLZ: Writing module 49915 1727204293.89856: ANSIBALLZ: Renaming module 49915 1727204293.89863: ANSIBALLZ: Done creating module 49915 1727204293.89911: variable 'ansible_facts' from source: unknown 49915 1727204293.89920: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204293.89930: _low_level_execute_command(): starting 49915 1727204293.89937: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 49915 1727204293.90595: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204293.90654: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204293.90707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204293.90723: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204293.90887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204293.90938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204293.92643: stdout chunk (state=3): >>>PLATFORM <<< 49915 1727204293.92724: stdout chunk (state=3): >>>Linux <<< 49915 1727204293.92729: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<< 49915 1727204293.92734: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 49915 1727204293.92919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204293.92923: stdout chunk (state=3): >>><<< 49915 1727204293.92931: stderr chunk (state=3): >>><<< 49915 1727204293.93000: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204293.93080 [managed-node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 49915 1727204293.93087: _low_level_execute_command(): starting 49915 1727204293.93090: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 49915 1727204293.93343: Sending initial data 49915 1727204293.93347: Sent initial data (1181 bytes) 49915 1727204293.94181: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204293.94185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204293.94338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204293.94382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204293.94495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204293.94593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204293.99184: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 49915 1727204293.99313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204293.99321: stderr chunk (state=3): >>><<< 49915 1727204293.99324: stdout chunk (state=3): >>><<< 49915 1727204293.99343: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel 
fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204293.99429: variable 'ansible_facts' from source: unknown 49915 1727204293.99432: variable 'ansible_facts' from source: unknown 49915 1727204293.99446: variable 'ansible_module_compression' from source: unknown 49915 1727204293.99648: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 49915 1727204293.99721: variable 'ansible_facts' from source: unknown 49915 1727204293.99944: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204293.404674-50017-136991992923155/AnsiballZ_setup.py 49915 1727204294.00295: Sending initial data 49915 1727204294.00298: Sent initial data (153 bytes) 49915 1727204294.01253: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204294.01290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204294.01300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204294.01308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204294.01385: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204294.01402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204294.01510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 49915 1727204294.03272: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 49915 1727204294.03278: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204294.03493: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204293.404674-50017-136991992923155/AnsiballZ_setup.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpwwsppr4i" to remote "/root/.ansible/tmp/ansible-tmp-1727204293.404674-50017-136991992923155/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204293.404674-50017-136991992923155/AnsiballZ_setup.py" <<< 49915 1727204294.03497: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpwwsppr4i /root/.ansible/tmp/ansible-tmp-1727204293.404674-50017-136991992923155/AnsiballZ_setup.py <<< 49915 1727204294.06216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204294.06430: stderr chunk (state=3): >>><<< 49915 1727204294.06434: stdout chunk (state=3): >>><<< 49915 1727204294.06437: done transferring module to remote 49915 1727204294.06450: _low_level_execute_command(): starting 49915 1727204294.06453: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204293.404674-50017-136991992923155/ /root/.ansible/tmp/ansible-tmp-1727204293.404674-50017-136991992923155/AnsiballZ_setup.py && sleep 0' 49915 1727204294.07535: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204294.07558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 49915 1727204294.07599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204294.07602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 
originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204294.07664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204294.07784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204294.07787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204294.07886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204294.09784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204294.09919: stderr chunk (state=3): >>><<< 49915 1727204294.09923: stdout chunk (state=3): >>><<< 49915 1727204294.09925: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204294.09928: _low_level_execute_command(): starting 49915 1727204294.09930: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204293.404674-50017-136991992923155/AnsiballZ_setup.py && sleep 0' 49915 1727204294.11007: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204294.11281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204294.11296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 
1727204294.11785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204294.14042: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 49915 1727204294.14056: stdout chunk (state=3): >>>import _imp # builtin <<< 49915 1727204294.14119: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 49915 1727204294.14240: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # <<< 49915 1727204294.14246: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 49915 1727204294.14339: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204294.14406: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 49915 1727204294.14448: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d09b44d0> <<< 49915 1727204294.14524: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0983b00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d09b6a50> import '_signal' # <<< 49915 1727204294.14559: stdout chunk (state=3): >>>import '_abc' # <<< 49915 1727204294.14562: stdout chunk (state=3): >>>import 'abc' # <<< 49915 1727204294.14571: stdout chunk (state=3): >>>import 'io' # <<< 49915 1727204294.14657: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 49915 1727204294.14695: stdout chunk (state=3): >>>import '_collections_abc' # <<< 49915 1727204294.14754: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 49915 1727204294.14772: stdout chunk (state=3): >>>import 'os' # <<< 49915 1727204294.14777: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 49915 1727204294.14902: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 49915 1727204294.15011: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0765130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0766060> import 'site' # <<< 49915 1727204294.15021: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 49915 1727204294.15418: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 49915 1727204294.15438: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204294.15749: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07a3ec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07a3f80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 49915 1727204294.15939: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07db8c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07dbf50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07bbb90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07b92b0> <<< 49915 1727204294.16002: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07a1070> <<< 49915 1727204294.16053: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 49915 1727204294.16060: stdout chunk (state=3): >>>import '_sre' # <<< 49915 1727204294.16105: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 49915 1727204294.16114: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 49915 1727204294.16173: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 49915 1727204294.16206: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07ff800> <<< 49915 1727204294.16209: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07fe450> <<< 49915 1727204294.16262: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07ba150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07a0260> <<< 49915 1727204294.16279: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 49915 1727204294.16404: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d08308c0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07a02f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d0830d70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0830c20> <<< 49915 1727204294.16408: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d0831010> <<< 49915 1727204294.16423: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d079ee10> <<< 49915 1727204294.16440: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 49915 1727204294.16451: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204294.16524: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0831670> <<< 49915 1727204294.16527: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0831370> import 'importlib.machinery' # <<< 49915 1727204294.16601: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0832540> <<< 49915 1727204294.16604: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 49915 1727204294.16726: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 49915 1727204294.16756: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0848740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204294.16763: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d0849e20> <<< 49915 1727204294.16809: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 49915 1727204294.16814: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 49915 1727204294.16816: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 49915 1727204294.16827: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d084acc0> <<< 49915 1727204294.17250: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d084b2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d084a210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d084bd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d084b4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d08324b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from 
'/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d053bc80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d05646e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0564440> <<< 49915 1727204294.17253: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d0564560> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 49915 1727204294.17359: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204294.17742: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d0565010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d0565a00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d05648c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0539e20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0566e10> <<< 49915 1727204294.17750: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0565b50> <<< 49915 1727204294.17753: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0832c60> <<< 49915 1727204294.17772: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 49915 1727204294.17930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches 
/usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d05931a0> <<< 49915 1727204294.17971: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 49915 1727204294.17992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204294.18006: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 49915 1727204294.18035: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 49915 1727204294.18144: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d05b7530> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 49915 1727204294.18360: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d06142f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 49915 1727204294.18498: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0616a50> <<< 49915 1727204294.18501: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0614410> <<< 49915 1727204294.18527: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d05dd2e0> <<< 49915 1727204294.18586: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cff253d0> <<< 49915 1727204294.18589: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d05b6330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0567d70> <<< 49915 1727204294.18790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 49915 1727204294.18793: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc7cff25640> <<< 49915 1727204294.19069: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_x_4kxrph/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 49915 1727204294.19336: stdout chunk (state=3): >>># zipimport: zlib available # 
/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 49915 1727204294.19357: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 49915 1727204294.19408: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cff8b0e0> <<< 49915 1727204294.19499: stdout chunk (state=3): >>>import '_typing' # <<< 49915 1727204294.19582: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cff69fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cff69130> <<< 49915 1727204294.19589: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.19632: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 49915 1727204294.19716: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 49915 1727204294.21422: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.22217: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cff88f80> <<< 49915 1727204294.22573: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cffbea50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cffbe7e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cffbe0f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cffbe540> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cff8bb60> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cffbf770> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cffbf950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 49915 1727204294.22608: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 49915 1727204294.22707: stdout chunk (state=3): >>>import '_locale' # <<< 49915 1727204294.22725: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cffbfe60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 49915 1727204294.22870: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe25c10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfe27830> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 49915 1727204294.22884: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe2c170> <<< 49915 1727204294.22896: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 49915 1727204294.22921: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 49915 1727204294.22944: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe2d2e0> <<< 49915 1727204294.22955: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 49915 1727204294.23043: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 49915 1727204294.23046: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 49915 1727204294.23051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 49915 1727204294.23338: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe2fd70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d079ef00> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe2e090> 
# /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 49915 1727204294.23455: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe33d70> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe32840> <<< 49915 1727204294.23461: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe325a0> <<< 49915 1727204294.23553: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 49915 1727204294.23556: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe32b10> <<< 49915 1727204294.23596: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe2e570> <<< 49915 1727204294.23613: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfe77a10> <<< 49915 1727204294.23637: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 49915 1727204294.23682: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe78110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 49915 1727204294.23782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfe79bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe79970> <<< 49915 1727204294.23789: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py 
<<< 49915 1727204294.23806: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 49915 1727204294.23855: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfe7c0e0> <<< 49915 1727204294.23930: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe7a270> <<< 49915 1727204294.23933: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204294.24216: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe7f8c0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe7c290> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfe80650> <<< 49915 1727204294.24297: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfe80aa0> <<< 49915 1727204294.24301: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfe80b60> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe78230> <<< 49915 1727204294.24331: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 49915 1727204294.24334: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 49915 1727204294.24430: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7fc7cfd0c290> <<< 49915 1727204294.24569: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204294.24583: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfd0d610> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe82a20> <<< 49915 1727204294.24664: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfe83dd0> <<< 49915 1727204294.24668: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe82660> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 49915 1727204294.24746: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.25036: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available <<< 49915 1727204294.25456: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.26085: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204294.26088: stdout chunk (state=3): >>> <<< 49915 1727204294.27024: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 49915 1727204294.27047: stdout chunk (state=3): >>> import 'ansible.module_utils.six.moves' # <<< 49915 1727204294.27070: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 49915 1727204294.27129: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 49915 1727204294.27163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc'<<< 49915 1727204294.27285: stdout chunk (state=3): >>> # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204294.27293: stdout chunk (state=3): >>> # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204294.27368: stdout chunk (state=3): >>> import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfd11880> <<< 49915 1727204294.27425: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py<<< 49915 1727204294.27438: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 49915 1727204294.27505: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfd12660> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fc7cfd0d730><<< 49915 1727204294.27540: stdout chunk (state=3): >>> <<< 49915 1727204294.27605: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 49915 1727204294.27677: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # <<< 49915 1727204294.27759: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.28090: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.28138: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfd125a0> # zipimport: zlib available <<< 49915 1727204294.28576: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.29215: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available <<< 49915 1727204294.29255: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 49915 1727204294.29268: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.29337: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.29580: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 49915 1727204294.29644: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 49915 1727204294.29977: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.30389: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py<<< 49915 1727204294.30444: stdout chunk (state=3): >>> <<< 49915 1727204294.30496: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 49915 1727204294.30519: stdout chunk (state=3): >>>import '_ast' # <<< 49915 1727204294.30634: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfd138c0><<< 49915 1727204294.30678: stdout chunk (state=3): >>> <<< 49915 1727204294.30793: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.30814: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204294.30842: stdout chunk (state=3): >>> <<< 49915 1727204294.30945: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 49915 1727204294.31003: stdout chunk (state=3): >>> import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 49915 1727204294.31127: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49915 1727204294.31186: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 49915 1727204294.31208: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.31270: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204294.31344: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204294.31442: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 49915 1727204294.31566: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 49915 1727204294.31647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'<<< 49915 1727204294.31740: stdout chunk (state=3): >>> <<< 49915 1727204294.31834: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204294.31837: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfd1e1b0> <<< 49915 1727204294.31923: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfd19190><<< 49915 1727204294.31934: stdout chunk (state=3): >>> <<< 49915 1727204294.31967: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 49915 1727204294.31979: stdout chunk (state=3): >>> <<< 49915 1727204294.32019: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 49915 1727204294.32144: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.32226: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.32348: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 49915 1727204294.32351: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204294.32394: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 49915 1727204294.32439: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'<<< 49915 1727204294.32544: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 49915 1727204294.32605: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 49915 1727204294.32644: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc'<<< 49915 1727204294.32742: stdout chunk (state=3): >>> import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe06ba0> <<< 49915 1727204294.32822: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfefe870> <<< 49915 1727204294.32951: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfd1e3c0> <<< 49915 1727204294.32985: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfd13110><<< 49915 1727204294.33051: stdout chunk (state=3): >>> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # 
zipimport: zlib available <<< 49915 1727204294.33088: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 49915 1727204294.33186: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # <<< 49915 1727204294.33217: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.33249: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.33261: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 49915 1727204294.33306: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.33468: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.33820: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 49915 1727204294.33856: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 49915 1727204294.33872: stdout chunk (state=3): >>> <<< 49915 1727204294.33887: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204294.34016: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204294.34147: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.34223: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.34280: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 49915 1727204294.34382: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.34591: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.34882: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.35065: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 49915 1727204294.35089: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 49915 1727204294.35103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 49915 1727204294.35160: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfdb2480> <<< 49915 1727204294.35170: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 49915 1727204294.35196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 49915 1727204294.35208: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 49915 1727204294.35322: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 49915 1727204294.35325: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches 
/usr/lib64/python3.12/_compat_pickle.py <<< 49915 1727204294.35327: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 49915 1727204294.35383: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf9680e0> <<< 49915 1727204294.35392: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cf968410> <<< 49915 1727204294.35515: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfd9f1a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfdb2fc0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfdb0b90> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfdb0830> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 49915 1727204294.35624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 49915 1727204294.35627: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 49915 1727204294.35708: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 49915 1727204294.35748: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cf96b440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf96acf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cf96aea0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf96a120> <<< 49915 1727204294.35916: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 49915 1727204294.35931: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf96b5f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 49915 1727204294.35957: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 49915 1727204294.35990: stdout chunk 
(state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cf9ce120> <<< 49915 1727204294.36047: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf9cc140> <<< 49915 1727204294.36071: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfdb0d70> import 'ansible.module_utils.facts.timeout' # <<< 49915 1727204294.36102: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 49915 1727204294.36131: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.36186: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.36258: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 49915 1727204294.36307: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.36378: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 49915 1727204294.36400: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 49915 1727204294.36479: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 49915 1727204294.36541: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.36583: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 49915 1727204294.36586: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.36622: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.36693: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 49915 1727204294.36927: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 49915 1727204294.37027: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 49915 1727204294.37827: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.38574: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 49915 1727204294.38659: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.38696: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.38783: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.38910: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 49915 1727204294.38913: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 49915 1727204294.39006: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.39233: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 49915 1727204294.39294: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.39393: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 49915 1727204294.39448: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 49915 1727204294.39566: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf9cf7d0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 49915 1727204294.39672: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf9ceb40> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 49915 1727204294.39706: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.39753: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 49915 1727204294.40001: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.40492: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # <<< 49915 1727204294.40782: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfa06390> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf9f6f90> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 49915 1727204294.40901: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 49915 1727204294.40928: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.41044: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.41309: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 49915 1727204294.41343: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 49915 1727204294.41443: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 49915 1727204294.41482: stdout chunk (state=3): >>># extension 
module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204294.41528: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfa1dfa0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfa1f9b0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 49915 1727204294.41536: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 49915 1727204294.41625: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49915 1727204294.41687: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 49915 1727204294.41903: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.41934: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 49915 1727204294.41948: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.42043: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.42144: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.42189: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.42305: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 49915 1727204294.42309: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 49915 1727204294.42441: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.42717: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 49915 1727204294.42783: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.42992: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 49915 1727204294.43700: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.44485: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 49915 1727204294.44515: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 49915 1727204294.44657: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.44831: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 49915 1727204294.44860: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.44978: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.45136: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 49915 1727204294.45158: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.45392: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.45664: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 49915 1727204294.45689: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 49915 1727204294.45721: stdout chunk (state=3): >>> <<< 49915 1727204294.45727: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 49915 1727204294.45814: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.45849: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 49915 1727204294.45870: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.45963: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.46057: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.46269: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.46466: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 49915 1727204294.46504: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.46520: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.46555: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 49915 1727204294.46583: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.46621: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.46624: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 49915 1727204294.47082: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 49915 1727204294.47148: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.47244: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 49915 1727204294.47276: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.47774: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.48267: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 49915 1727204294.48427: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # <<< 49915 1727204294.48488: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.48491: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204294.48502: stdout chunk (state=3): >>> <<< 49915 1727204294.48549: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available<<< 49915 1727204294.48595: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204294.48687: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 49915 1727204294.48702: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.48793: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # <<< 49915 1727204294.48799: stdout chunk (state=3): >>> <<< 49915 1727204294.48855: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.48957: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204294.49094: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.network.sunos' # <<< 49915 1727204294.49118: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 
1727204294.49162: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204294.49166: stdout chunk (state=3): >>> <<< 49915 1727204294.49211: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 49915 1727204294.49214: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.49306: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.49377: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 49915 1727204294.49396: stdout chunk (state=3): >>> <<< 49915 1727204294.49422: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49915 1727204294.49454: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204294.49544: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204294.49633: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.49771: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.49909: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 49915 1727204294.49918: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # <<< 49915 1727204294.49942: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 49915 1727204294.50018: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204294.50097: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.virtual.hpux' # <<< 49915 1727204294.50118: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204294.50127: stdout chunk (state=3): >>> <<< 49915 1727204294.50552: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.50873: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 49915 1727204294.50999: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 49915 1727204294.51067: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204294.51083: stdout chunk (state=3): >>> <<< 49915 1727204294.51150: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 49915 1727204294.51326: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 49915 1727204294.51329: stdout chunk (state=3): >>> <<< 49915 1727204294.51510: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 49915 1727204294.51589: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.51741: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 49915 1727204294.51766: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 49915 1727204294.51770: stdout chunk (state=3): >>> <<< 49915 1727204294.51893: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204294.52162: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py<<< 49915 1727204294.52169: stdout chunk (state=3): >>> <<< 49915 1727204294.52210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py<<< 49915 1727204294.52215: stdout chunk (state=3): >>> <<< 49915 1727204294.52286: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204294.52301: stdout chunk (state=3): >>> <<< 49915 1727204294.52305: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204294.52310: stdout chunk (state=3): >>> <<< 49915 1727204294.52348: stdout chunk (state=3): >>>import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cf7b3050> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf7b1bb0><<< 49915 1727204294.52357: stdout chunk (state=3): >>> <<< 49915 1727204294.52480: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf7b1640> <<< 49915 1727204294.73269: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py<<< 49915 1727204294.73274: stdout chunk (state=3): >>> <<< 49915 1727204294.73318: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 49915 1727204294.73355: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf7f86b0><<< 49915 1727204294.73360: stdout chunk (state=3): >>> <<< 49915 1727204294.73404: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py<<< 49915 1727204294.73409: stdout chunk (state=3): >>> <<< 49915 1727204294.73446: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc'<<< 49915 1727204294.73495: stdout chunk (state=3): >>> import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf7f9670><<< 49915 1727204294.73502: stdout chunk (state=3): >>> <<< 49915 1727204294.73639: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc'<<< 49915 1727204294.73647: stdout chunk (state=3): >>> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 49915 1727204294.73663: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 49915 1727204294.73732: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf7fb8f0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf7fa570><<< 49915 1727204294.73754: stdout chunk (state=3): >>> <<< 49915 1727204294.74065: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a 
frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 49915 1727204294.98983: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "58", "second": "14", "epoch": "1727204294", "epoch_int": "1727204294", "date": "2024-09-24", "time": "14:58:14", "iso8601_micro": "2024-09-24T18:58:14.530813Z", "iso8601": "2024-09-24T18:58:14Z", "iso8601_basic": "20240924T145814530813", "iso8601_basic_short": "20240924T145814", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": 
"/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releasele<<< 49915 1727204294.99032: stdout chunk (state=3): >>>vel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.60888671875, "5m": 0.6630859375, "15m": 0.38232421875}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2900, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 631, "free": 2900}, "nocache": {"free": 3278, "used": 253}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": 
"4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 880, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261762297856, "block_size": 4096, "block_total": 65519099, "block_available": 63906811, "block_used": 1612288, "inode_total": 131070960, "inode_available": 131027120, "inode_used": 43840, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", <<< 49915 1727204294.99041: stdout chunk (state=3): >>>"tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "o<<< 49915 1727204294.99157: stdout chunk (state=3): >>>ff [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", 
"127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 49915 1727204295.00033: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ <<< 49915 1727204295.00070: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1<<< 49915 1727204295.00088: stdout chunk (state=3): >>> # clear sys.ps2 # clear sys.last_exc<<< 49915 1727204295.00121: stdout chunk (state=3): >>> # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins<<< 49915 1727204295.00365: stdout chunk (state=3): >>> # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy 
random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing<<< 49915 1727204295.00373: stdout chunk (state=3): >>> # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json <<< 49915 1727204295.00378: stdout chunk (state=3): >>># cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal<<< 49915 1727204295.00413: stdout chunk (state=3): >>> # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime<<< 49915 1727204295.00438: stdout chunk (state=3): >>> # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket<<< 49915 1727204295.00453: stdout chunk (state=3): >>> # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc<<< 49915 1727204295.00666: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4<<< 49915 1727204295.00670: stdout chunk (state=3): >>> # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro<<< 49915 1727204295.00673: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace<<< 49915 1727204295.00677: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ <<< 49915 1727204295.00680: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter<<< 49915 1727204295.00682: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing 
ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg<<< 49915 1727204295.00918: stdout chunk (state=3): >>> # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing 
ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy 
ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 49915 1727204295.01321: stdout chunk (state=3): >>># destroy _sitebuiltins<<< 49915 1727204295.01360: stdout chunk (state=3): >>> <<< 49915 1727204295.01370: stdout chunk (state=3): >>># destroy importlib.machinery <<< 49915 1727204295.01394: stdout chunk (state=3): >>># destroy importlib._abc # destroy importlib.util <<< 49915 1727204295.01466: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression <<< 49915 1727204295.01533: stdout chunk (state=3): >>># destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma <<< 49915 1727204295.01566: stdout chunk (state=3): >>># destroy zipfile._path # destroy zipfile<<< 49915 1727204295.01589: stdout chunk (state=3): >>> # destroy pathlib # destroy zipfile._path.glob <<< 49915 1727204295.01680: stdout chunk (state=3): >>># destroy ipaddress # destroy ntpath # destroy importlib<<< 49915 1727204295.01752: stdout chunk (state=3): >>> # destroy zipimport # destroy __main__<<< 49915 1727204295.01757: stdout chunk (state=3): >>> # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 49915 1727204295.01804: stdout chunk (state=3): >>># destroy _locale # destroy locale<<< 49915 1727204295.01808: stdout chunk (state=3): >>> # destroy select # destroy _signal # destroy _posixsubprocess<<< 49915 1727204295.01810: stdout chunk (state=3): >>> # destroy syslog<<< 49915 1727204295.01980: stdout chunk (state=3): >>> # destroy uuid <<< 49915 1727204295.02025: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues<<< 49915 1727204295.02207: stdout chunk (state=3): >>> # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime <<< 49915 1727204295.02243: stdout chunk (state=3): >>># destroy subprocess # destroy base64<<< 49915 1727204295.02265: stdout chunk (state=3): >>> # destroy _ssl<<< 49915 1727204295.02332: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd<<< 49915 1727204295.02352: stdout chunk (state=3): >>> # destroy termios # destroy json <<< 49915 1727204295.02419: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch<<< 
49915 1727204295.02506: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process<<< 49915 1727204295.02551: stdout chunk (state=3): >>> # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna <<< 49915 1727204295.02783: stdout chunk (state=3): >>># destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader<<< 49915 1727204295.02787: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum<<< 49915 1727204295.03081: stdout chunk (state=3): >>> # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat<<< 49915 1727204295.03084: stdout chunk (state=3): >>> # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs<<< 49915 1727204295.03117: stdout chunk (state=3): >>> # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 49915 1727204295.03210: stdout chunk (state=3): >>># destroy sys.monitoring<<< 49915 1727204295.03240: stdout chunk (state=3): >>> # destroy _socket <<< 49915 
1727204295.03259: stdout chunk (state=3): >>># destroy _collections<<< 49915 1727204295.03281: stdout chunk (state=3): >>> <<< 49915 1727204295.03351: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser<<< 49915 1727204295.03411: stdout chunk (state=3): >>> # destroy tokenize # destroy ansible.module_utils.six.moves.urllib<<< 49915 1727204295.03425: stdout chunk (state=3): >>> # destroy copyreg # destroy contextlib<<< 49915 1727204295.03467: stdout chunk (state=3): >>> # destroy _typing<<< 49915 1727204295.03509: stdout chunk (state=3): >>> # destroy _tokenize<<< 49915 1727204295.03512: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools<<< 49915 1727204295.03553: stdout chunk (state=3): >>> # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external <<< 49915 1727204295.03611: stdout chunk (state=3): >>># destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules <<< 49915 1727204295.03750: stdout chunk (state=3): >>># destroy _frozen_importlib # destroy codecs<<< 49915 1727204295.03826: stdout chunk (state=3): >>> # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 <<< 49915 1727204295.03829: stdout chunk (state=3): >>># destroy encodings.idna<<< 49915 1727204295.03882: stdout chunk (state=3): >>> # destroy _codecs # destroy io <<< 49915 1727204295.03906: stdout chunk (state=3): >>># destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit<<< 49915 1727204295.03932: stdout chunk (state=3): >>> # destroy _warnings # destroy math # destroy _bisect # destroy time<<< 49915 1727204295.03968: stdout chunk (state=3): >>> # destroy _random<<< 49915 1727204295.03991: stdout chunk (state=3): >>> # destroy _weakref <<< 49915 1727204295.04027: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator<<< 49915 1727204295.04062: stdout chunk (state=3): >>> # destroy _sre # destroy _string # destroy re # destroy itertools<<< 49915 1727204295.04072: stdout chunk (state=3): >>> <<< 49915 1727204295.04105: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread<<< 49915 1727204295.04129: stdout chunk (state=3): >>> # clear sys.audit hooks<<< 49915 1727204295.04141: stdout chunk (state=3): >>> <<< 49915 1727204295.04769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
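The stdout chunks above carry a single JSON document: everything the setup module gathered is nested under "ansible_facts" (ansible_distribution, ansible_default_ipv4, ansible_memtotal_mb, and so on), followed by the "invocation" echo of the module arguments. A minimal sketch of pulling a few of those keys back out of a captured copy of that payload, assuming it has been saved to a local file; the facts.json path and the chosen keys are illustrative only, not part of this run:

    import json

    # Load a captured copy of the module's stdout payload (hypothetical path).
    with open("facts.json") as fh:
        result = json.load(fh)

    facts = result["ansible_facts"]

    # A few of the keys visible in the payload above.
    print(facts["ansible_distribution"], facts["ansible_distribution_version"])
    print("default ipv4:", facts["ansible_default_ipv4"]["address"])
    print("memory (MB):", facts["ansible_memtotal_mb"])

Inside a play the same values are normally reached through templating instead, e.g. {{ ansible_default_ipv4.address }} or {{ ansible_facts['default_ipv4']['address'] }}.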
<<< 49915 1727204295.04772: stdout chunk (state=3): >>><<< 49915 1727204295.04774: stderr chunk (state=3): >>><<< 49915 1727204295.04949: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d09b44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0983b00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d09b6a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0765130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0766060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
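The stdout reassembled in this done-summary is ordinary CPython verbose-import tracing: the remote interpreter runs with PYTHONVERBOSE=1 (visible in ansible_env above), so each import reports the code object and loader it came from. A small, self-contained sketch of producing the same kind of trace for a single module locally; the choice of base64 as the demo import is arbitrary:

    import subprocess
    import sys

    # Run a child interpreter with verbose imports enabled; -v on the
    # command line is equivalent to setting PYTHONVERBOSE=1.
    proc = subprocess.run(
        [sys.executable, "-v", "-c", "import base64"],
        capture_output=True,
        text=True,
    )

    # For a plain "python -v" invocation the trace is written to stderr.
    for line in proc.stderr.splitlines():
        if line.startswith(("import ", "# code object from ")):
            print(line)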
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07a3ec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07a3f80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07db8c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07dbf50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07bbb90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07b92b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07a1070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07ff800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07fe450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07ba150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07a0260> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d08308c0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d07a02f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d0830d70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0830c20> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d0831010> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d079ee10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0831670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0831370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0832540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0848740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d0849e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc7d084acc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d084b2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d084a210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d084bd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d084b4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d08324b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d053bc80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d05646e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0564440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d0564560> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d0565010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d0565a00> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc7d05648c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0539e20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0566e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0565b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0832c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d05931a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d05b7530> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d06142f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0616a50> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0614410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d05dd2e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cff253d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d05b6330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7d0567d70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7fc7cff25640> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_x_4kxrph/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cff8b0e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cff69fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cff69130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cff88f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cffbea50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cffbe7e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cffbe0f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cffbe540> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cff8bb60> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cffbf770> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cffbf950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cffbfe60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe25c10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfe27830> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe2c170> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe2d2e0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe2fd70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7d079ef00> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe2e090> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe33d70> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe32840> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc7cfe325a0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe32b10> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe2e570> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfe77a10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe78110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfe79bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe79970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfe7c0e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe7a270> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe7f8c0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe7c290> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfe80650> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfe80aa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfe80b60> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe78230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfd0c290> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfd0d610> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe82a20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfe83dd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe82660> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfd11880> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfd12660> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfd0d730> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfd125a0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfd138c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfd1e1b0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfd19190> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfe06ba0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfefe870> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfd1e3c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfd13110> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfdb2480> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf9680e0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cf968410> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfd9f1a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfdb2fc0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfdb0b90> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfdb0830> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cf96b440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf96acf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cf96aea0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf96a120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf96b5f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cf9ce120> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf9cc140> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfdb0d70> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf9cf7d0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf9ceb40> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfa06390> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf9f6f90> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cfa1dfa0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cfa1f9b0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc7cf7b3050> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf7b1bb0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf7b1640> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf7f86b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf7f9670> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf7fb8f0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc7cf7fa570> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "58", "second": "14", "epoch": "1727204294", "epoch_int": "1727204294", "date": "2024-09-24", "time": "14:58:14", "iso8601_micro": "2024-09-24T18:58:14.530813Z", "iso8601": "2024-09-24T18:58:14Z", "iso8601_basic": "20240924T145814530813", "iso8601_basic_short": "20240924T145814", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, 
"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.60888671875, "5m": 0.6630859375, "15m": 0.38232421875}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2900, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 631, "free": 2900}, "nocache": {"free": 3278, "used": 253}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], 
"uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 880, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261762297856, "block_size": 4096, "block_total": 65519099, "block_available": 63906811, "block_used": 1612288, "inode_total": 131070960, "inode_available": 131027120, "inode_used": 43840, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", 
"hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, 
"ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] 
removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # 
cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # 
destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. [WARNING]: Module invocation had junk after the JSON data: (interpreter shutdown/cleanup output identical to the "# clear ... # cleanup ... # destroy ..." block reproduced above) [WARNING]: Platform linux on host managed-node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
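The interpreter-discovery warning above is informational: Ansible probed managed-node2 and settled on /usr/bin/python3.12. If the run should not depend on discovery (for example, to keep behaviour stable if another Python is installed later), the interpreter can be pinned per host. A minimal sketch, assuming a host_vars file for managed-node2 (the file name and placement are illustrative, not taken from this run):

    # host_vars/managed-node2.yml -- pin the interpreter so discovery is skipped (illustrative)
    ansible_python_interpreter: /usr/bin/python3.12

Equivalently, interpreter_python can be set in the [defaults] section of ansible.cfg; the interpreter_discovery reference linked in the warning describes both options.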
49915 1727204295.06886: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204293.404674-50017-136991992923155/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204295.06890: _low_level_execute_command(): starting 49915 1727204295.06893: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204293.404674-50017-136991992923155/ > /dev/null 2>&1 && sleep 0' 49915 1727204295.07083: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204295.07086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204295.07088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204295.07090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204295.07093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204295.07095: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204295.07097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204295.07104: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204295.07124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204295.07235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204295.10039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204295.10042: stderr chunk (state=3): >>><<< 49915 1727204295.10045: stdout chunk (state=3): >>><<< 49915 1727204295.10047: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204295.10065: handler run complete 49915 1727204295.10310: variable 'ansible_facts' from source: unknown 49915 1727204295.10523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204295.12947: variable 'ansible_facts' from source: unknown 49915 1727204295.13146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204295.13380: attempt loop complete, returning result 49915 1727204295.13383: _execute() done 49915 1727204295.13386: dumping result to json 49915 1727204295.13534: done dumping result, returning 49915 1727204295.13682: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-dcd7-b5af-0000000000af] 49915 1727204295.13686: sending task result for task 028d2410-947f-dcd7-b5af-0000000000af 49915 1727204295.14181: done sending task result for task 028d2410-947f-dcd7-b5af-0000000000af 49915 1727204295.14184: WORKER PROCESS EXITING ok: [managed-node2] 49915 1727204295.14664: no more pending results, returning what we have 49915 1727204295.14667: results queue empty 49915 1727204295.14667: checking for any_errors_fatal 49915 1727204295.14669: done checking for any_errors_fatal 49915 1727204295.14669: checking for max_fail_percentage 49915 1727204295.14671: done checking for max_fail_percentage 49915 1727204295.14672: checking to see if all hosts have failed and the running result is not ok 49915 1727204295.14672: done checking to see if all hosts have failed 49915 1727204295.14673: getting the remaining hosts for this loop 49915 1727204295.14675: done getting the remaining hosts for this loop 49915 1727204295.14883: getting the next task for host managed-node2 49915 1727204295.14889: done getting next task for host managed-node2 49915 1727204295.14891: ^ task is: TASK: meta (flush_handlers) 49915 1727204295.14893: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204295.14897: getting variables 49915 1727204295.14898: in VariableManager get_vars() 49915 1727204295.14924: Calling all_inventory to load vars for managed-node2 49915 1727204295.14927: Calling groups_inventory to load vars for managed-node2 49915 1727204295.14930: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204295.14941: Calling all_plugins_play to load vars for managed-node2 49915 1727204295.14944: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204295.14947: Calling groups_plugins_play to load vars for managed-node2 49915 1727204295.15246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204295.15900: done with get_vars() 49915 1727204295.16029: done getting variables 49915 1727204295.16098: in VariableManager get_vars() 49915 1727204295.16107: Calling all_inventory to load vars for managed-node2 49915 1727204295.16110: Calling groups_inventory to load vars for managed-node2 49915 1727204295.16115: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204295.16120: Calling all_plugins_play to load vars for managed-node2 49915 1727204295.16179: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204295.16184: Calling groups_plugins_play to load vars for managed-node2 49915 1727204295.16438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204295.16855: done with get_vars() 49915 1727204295.17186: done queuing things up, now waiting for results queue to drain 49915 1727204295.17188: results queue empty 49915 1727204295.17189: checking for any_errors_fatal 49915 1727204295.17192: done checking for any_errors_fatal 49915 1727204295.17197: checking for max_fail_percentage 49915 1727204295.17198: done checking for max_fail_percentage 49915 1727204295.17199: checking to see if all hosts have failed and the running result is not ok 49915 1727204295.17200: done checking to see if all hosts have failed 49915 1727204295.17200: getting the remaining hosts for this loop 49915 1727204295.17201: done getting the remaining hosts for this loop 49915 1727204295.17204: getting the next task for host managed-node2 49915 1727204295.17208: done getting next task for host managed-node2 49915 1727204295.17211: ^ task is: TASK: Include the task 'el_repo_setup.yml' 49915 1727204295.17214: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204295.17217: getting variables 49915 1727204295.17218: in VariableManager get_vars() 49915 1727204295.17226: Calling all_inventory to load vars for managed-node2 49915 1727204295.17228: Calling groups_inventory to load vars for managed-node2 49915 1727204295.17230: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204295.17234: Calling all_plugins_play to load vars for managed-node2 49915 1727204295.17235: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204295.17238: Calling groups_plugins_play to load vars for managed-node2 49915 1727204295.17357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204295.17942: done with get_vars() 49915 1727204295.17951: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:11 Tuesday 24 September 2024 14:58:15 -0400 (0:00:01.863) 0:00:01.888 ***** 49915 1727204295.18237: entering _queue_task() for managed-node2/include_tasks 49915 1727204295.18239: Creating lock for include_tasks 49915 1727204295.19208: worker is 1 (out of 1 available) 49915 1727204295.19221: exiting _queue_task() for managed-node2/include_tasks 49915 1727204295.19230: done queuing things up, now waiting for results queue to drain 49915 1727204295.19232: waiting for pending results... 49915 1727204295.19898: running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' 49915 1727204295.19904: in run() - task 028d2410-947f-dcd7-b5af-000000000006 49915 1727204295.19908: variable 'ansible_search_path' from source: unknown 49915 1727204295.19911: calling self._execute() 49915 1727204295.19987: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204295.20186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204295.20189: variable 'omit' from source: magic vars 49915 1727204295.20330: _execute() done 49915 1727204295.20338: dumping result to json 49915 1727204295.20346: done dumping result, returning 49915 1727204295.20357: done running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' [028d2410-947f-dcd7-b5af-000000000006] 49915 1727204295.20367: sending task result for task 028d2410-947f-dcd7-b5af-000000000006 49915 1727204295.20573: no more pending results, returning what we have 49915 1727204295.20582: in VariableManager get_vars() 49915 1727204295.20621: Calling all_inventory to load vars for managed-node2 49915 1727204295.20625: Calling groups_inventory to load vars for managed-node2 49915 1727204295.20629: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204295.20643: Calling all_plugins_play to load vars for managed-node2 49915 1727204295.20646: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204295.20650: Calling groups_plugins_play to load vars for managed-node2 49915 1727204295.21200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204295.21615: done with get_vars() 49915 1727204295.21623: variable 'ansible_search_path' from source: unknown 49915 1727204295.21885: we have included files to process 49915 1727204295.21886: generating all_blocks data 49915 1727204295.21888: done generating all_blocks data 49915 1727204295.21889: processing included file: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 49915 1727204295.21890: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 49915 1727204295.21893: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 49915 1727204295.22383: done sending task result for task 028d2410-947f-dcd7-b5af-000000000006 49915 1727204295.22386: WORKER PROCESS EXITING 49915 1727204295.22997: in VariableManager get_vars() 49915 1727204295.23011: done with get_vars() 49915 1727204295.23025: done processing included file 49915 1727204295.23027: iterating over new_blocks loaded from include file 49915 1727204295.23029: in VariableManager get_vars() 49915 1727204295.23037: done with get_vars() 49915 1727204295.23039: filtering new block on tags 49915 1727204295.23052: done filtering new block on tags 49915 1727204295.23055: in VariableManager get_vars() 49915 1727204295.23064: done with get_vars() 49915 1727204295.23066: filtering new block on tags 49915 1727204295.23284: done filtering new block on tags 49915 1727204295.23287: in VariableManager get_vars() 49915 1727204295.23298: done with get_vars() 49915 1727204295.23299: filtering new block on tags 49915 1727204295.23316: done filtering new block on tags 49915 1727204295.23318: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node2 49915 1727204295.23324: extending task lists for all hosts with included blocks 49915 1727204295.23372: done extending task lists 49915 1727204295.23373: done processing included files 49915 1727204295.23374: results queue empty 49915 1727204295.23375: checking for any_errors_fatal 49915 1727204295.23378: done checking for any_errors_fatal 49915 1727204295.23379: checking for max_fail_percentage 49915 1727204295.23380: done checking for max_fail_percentage 49915 1727204295.23380: checking to see if all hosts have failed and the running result is not ok 49915 1727204295.23381: done checking to see if all hosts have failed 49915 1727204295.23382: getting the remaining hosts for this loop 49915 1727204295.23383: done getting the remaining hosts for this loop 49915 1727204295.23385: getting the next task for host managed-node2 49915 1727204295.23390: done getting next task for host managed-node2 49915 1727204295.23392: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 49915 1727204295.23394: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204295.23396: getting variables 49915 1727204295.23398: in VariableManager get_vars() 49915 1727204295.23405: Calling all_inventory to load vars for managed-node2 49915 1727204295.23407: Calling groups_inventory to load vars for managed-node2 49915 1727204295.23409: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204295.23417: Calling all_plugins_play to load vars for managed-node2 49915 1727204295.23419: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204295.23422: Calling groups_plugins_play to load vars for managed-node2 49915 1727204295.23759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204295.24140: done with get_vars() 49915 1727204295.24148: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:58:15 -0400 (0:00:00.059) 0:00:01.948 ***** 49915 1727204295.24210: entering _queue_task() for managed-node2/setup 49915 1727204295.25113: worker is 1 (out of 1 available) 49915 1727204295.25122: exiting _queue_task() for managed-node2/setup 49915 1727204295.25130: done queuing things up, now waiting for results queue to drain 49915 1727204295.25131: waiting for pending results... 49915 1727204295.25315: running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 49915 1727204295.25456: in run() - task 028d2410-947f-dcd7-b5af-0000000000c0 49915 1727204295.25608: variable 'ansible_search_path' from source: unknown 49915 1727204295.25620: variable 'ansible_search_path' from source: unknown 49915 1727204295.25664: calling self._execute() 49915 1727204295.25795: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204295.25810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204295.25826: variable 'omit' from source: magic vars 49915 1727204295.26405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204295.28822: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204295.28895: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204295.29282: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204295.29286: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204295.29288: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204295.29359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204295.29399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204295.29440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 49915 1727204295.29562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204295.29681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204295.29957: variable 'ansible_facts' from source: unknown 49915 1727204295.30055: variable 'network_test_required_facts' from source: task vars 49915 1727204295.30107: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 49915 1727204295.30122: variable 'omit' from source: magic vars 49915 1727204295.30184: variable 'omit' from source: magic vars 49915 1727204295.30229: variable 'omit' from source: magic vars 49915 1727204295.30279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204295.30316: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204295.30341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204295.30362: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204295.30380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204295.30421: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204295.30430: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204295.30442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204295.30555: Set connection var ansible_connection to ssh 49915 1727204295.30608: Set connection var ansible_shell_type to sh 49915 1727204295.30611: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204295.30616: Set connection var ansible_shell_executable to /bin/sh 49915 1727204295.30619: Set connection var ansible_timeout to 10 49915 1727204295.30620: Set connection var ansible_pipelining to False 49915 1727204295.30641: variable 'ansible_shell_executable' from source: unknown 49915 1727204295.30657: variable 'ansible_connection' from source: unknown 49915 1727204295.30666: variable 'ansible_module_compression' from source: unknown 49915 1727204295.30673: variable 'ansible_shell_type' from source: unknown 49915 1727204295.30682: variable 'ansible_shell_executable' from source: unknown 49915 1727204295.30689: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204295.30719: variable 'ansible_pipelining' from source: unknown 49915 1727204295.30722: variable 'ansible_timeout' from source: unknown 49915 1727204295.30724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204295.30897: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49915 1727204295.30936: variable 'omit' from source: magic vars 49915 1727204295.30940: starting attempt loop 49915 
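The conditional evaluated just above decides whether this minimal fact-gathering task needs to run at all: it asks whether every fact listed in network_test_required_facts is already present in ansible_facts, and queues the setup module only when something is missing. A rough Python rendering of that Jinja expression (variable names come from the trace; the sample fact names are assumptions, not the test's real list):

# Cached facts for the host at this point in the run (none yet, hence the task runs)
ansible_facts = {}

# Facts the network role tests need before they start (illustrative names only)
network_test_required_facts = ["distribution", "distribution_major_version", "os_family"]

# Jinja: not ansible_facts.keys() | list | intersect(network_test_required_facts)
#            == network_test_required_facts
already_present = [f for f in ansible_facts if f in network_test_required_facts]
needs_gathering = not (already_present == network_test_required_facts)

print(needs_gathering)  # True in the trace, so the setup module is queued for managed-node2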
1727204295.30942: running the handler 49915 1727204295.30962: _low_level_execute_command(): starting 49915 1727204295.30978: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204295.31700: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204295.31705: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204295.31746: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204295.31780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204295.31796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204295.31889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 49915 1727204295.34103: stdout chunk (state=3): >>>/root <<< 49915 1727204295.34107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204295.34305: stderr chunk (state=3): >>><<< 49915 1727204295.34311: stdout chunk (state=3): >>><<< 49915 1727204295.34314: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 49915 1727204295.34324: _low_level_execute_command(): starting 49915 1727204295.34328: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204295.3422487-50116-164785714489511 `" && echo ansible-tmp-1727204295.3422487-50116-164785714489511="` echo 
/root/.ansible/tmp/ansible-tmp-1727204295.3422487-50116-164785714489511 `" ) && sleep 0' 49915 1727204295.34962: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204295.34980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204295.35011: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204295.35064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204295.35085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204295.35188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 49915 1727204295.37500: stdout chunk (state=3): >>>ansible-tmp-1727204295.3422487-50116-164785714489511=/root/.ansible/tmp/ansible-tmp-1727204295.3422487-50116-164785714489511 <<< 49915 1727204295.37503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204295.37506: stdout chunk (state=3): >>><<< 49915 1727204295.37508: stderr chunk (state=3): >>><<< 49915 1727204295.37510: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204295.3422487-50116-164785714489511=/root/.ansible/tmp/ansible-tmp-1727204295.3422487-50116-164785714489511 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 49915 1727204295.37512: variable 'ansible_module_compression' from source: unknown 49915 1727204295.37514: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 49915 
1727204295.37516: variable 'ansible_facts' from source: unknown 49915 1727204295.37828: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204295.3422487-50116-164785714489511/AnsiballZ_setup.py 49915 1727204295.38202: Sending initial data 49915 1727204295.38226: Sent initial data (154 bytes) 49915 1727204295.39198: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204295.39223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204295.39252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204295.39346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204295.41119: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204295.41198: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49915 1727204295.41318: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp0bwxgssn /root/.ansible/tmp/ansible-tmp-1727204295.3422487-50116-164785714489511/AnsiballZ_setup.py <<< 49915 1727204295.41329: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204295.3422487-50116-164785714489511/AnsiballZ_setup.py" <<< 49915 1727204295.41535: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp0bwxgssn" to remote "/root/.ansible/tmp/ansible-tmp-1727204295.3422487-50116-164785714489511/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204295.3422487-50116-164785714489511/AnsiballZ_setup.py" <<< 49915 1727204295.45654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204295.45736: stderr chunk (state=3): >>><<< 49915 1727204295.45793: stdout chunk (state=3): >>><<< 49915 1727204295.45863: done transferring module to remote 49915 1727204295.46064: _low_level_execute_command(): starting 49915 1727204295.46068: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204295.3422487-50116-164785714489511/ /root/.ansible/tmp/ansible-tmp-1727204295.3422487-50116-164785714489511/AnsiballZ_setup.py && sleep 0' 49915 1727204295.47221: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204295.47365: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204295.47393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204295.47440: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204295.47490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204295.47520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204295.47543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204295.47651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204295.49589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204295.49593: stdout chunk (state=3): >>><<< 49915 1727204295.49728: stderr chunk (state=3): >>><<< 49915 1727204295.49734: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 
originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204295.49736: _low_level_execute_command(): starting 49915 1727204295.49741: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204295.3422487-50116-164785714489511/AnsiballZ_setup.py && sleep 0' 49915 1727204295.50424: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204295.50490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204295.50577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204295.50601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204295.50661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204295.50810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204295.53005: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 49915 1727204295.53025: stdout chunk (state=3): >>>import _imp # builtin <<< 49915 1727204295.53069: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 49915 1727204295.53088: stdout chunk (state=3): >>>import '_weakref' # <<< 49915 1727204295.53148: stdout chunk (state=3): >>>import '_io' # <<< 49915 1727204295.53151: stdout chunk (state=3): >>>import 'marshal' # <<< 49915 1727204295.53180: stdout chunk (state=3): >>>import 'posix' # <<< 49915 1727204295.53221: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 49915 1727204295.53249: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 49915 1727204295.53307: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/__init__.py <<< 49915 1727204295.53324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 49915 1727204295.53353: stdout chunk (state=3): >>>import 'codecs' # <<< 49915 1727204295.53399: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 49915 1727204295.53462: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a856e84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a856b7b30> <<< 49915 1727204295.53485: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a856eaa50> <<< 49915 1727204295.53519: stdout chunk (state=3): >>>import '_signal' # <<< 49915 1727204295.53537: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 49915 1727204295.53570: stdout chunk (state=3): >>>import 'io' # <<< 49915 1727204295.53584: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 49915 1727204295.53660: stdout chunk (state=3): >>>import '_collections_abc' # <<< 49915 1727204295.53696: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 49915 1727204295.53724: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 49915 1727204295.53768: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 49915 1727204295.53799: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 49915 1727204295.53834: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 49915 1727204295.53843: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a85499130> <<< 49915 1727204295.53929: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204295.53932: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8549a060> <<< 49915 1727204295.53947: stdout chunk (state=3): >>>import 'site' # <<< 49915 1727204295.54019: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
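The low-level commands traced above are the usual module-execution lifecycle for a non-pipelined SSH connection (the trace sets ansible_pipelining to False): discover the remote home, create a private temp directory, sftp the AnsiballZ wrapper into it, chmod it, then run it with the remote Python, all over the already-open ControlMaster socket. A condensed local sketch of that same sequence, not the connection plugin's real code; the ssh/scp wrappers, the local payload path and the host alias are placeholders:

import subprocess

host = "managed-node2"  # placeholder for the real inventory host / ansible_host address
tmp = "/root/.ansible/tmp/ansible-tmp-1727204295.3422487-50116-164785714489511"

def ssh(cmd: str) -> subprocess.CompletedProcess:
    # stand-in for _low_level_execute_command(); OpenSSH reuses the ControlMaster here
    return subprocess.run(["ssh", host, cmd], capture_output=True, text=True)

ssh("echo ~ && sleep 0")                                   # where is $HOME on the target?
ssh(f'( umask 77 && mkdir -p "{tmp}" ) && sleep 0')        # private temp dir (condensed)
subprocess.run(["scp", "/tmp/AnsiballZ_setup.py.local",    # placeholder local payload path
                f"{host}:{tmp}/AnsiballZ_setup.py"])       # the trace does this with sftp put
ssh(f"chmod u+x {tmp}/ {tmp}/AnsiballZ_setup.py && sleep 0")
result = ssh(f"/usr/bin/python3.12 {tmp}/AnsiballZ_setup.py && sleep 0")
print(result.stdout)                                       # the module's JSON output lands here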
<<< 49915 1727204295.54388: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 49915 1727204295.54406: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 49915 1727204295.54413: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204295.54423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 49915 1727204295.54703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a854d7f80> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a854ec110> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 49915 1727204295.54760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 49915 1727204295.54784: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 49915 1727204295.54815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 49915 1727204295.54819: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8550f9b0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 49915 1727204295.54834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8550ff80> <<< 49915 1727204295.54879: stdout chunk (state=3): >>>import '_collections' # <<< 49915 1727204295.54919: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a854efc50> <<< 49915 1727204295.54962: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a854ed370> <<< 49915 1727204295.55100: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a854d5130> <<< 49915 1727204295.55125: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 49915 1727204295.55153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 49915 1727204295.55198: stdout chunk (state=3): >>>import '_sre' # <<< 49915 
1727204295.55208: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 49915 1727204295.55247: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 49915 1727204295.55260: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 49915 1727204295.55294: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8552f8f0> <<< 49915 1727204295.55353: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8552e510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 49915 1727204295.55361: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a854ee210> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8552cda0> <<< 49915 1727204295.55528: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a855609b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a854d43b0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a85560e60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a85560d10> <<< 49915 1727204295.55551: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a85561100> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a854d2ed0> <<< 49915 1727204295.55595: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204295.55633: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 49915 1727204295.55668: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a855617f0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a855614c0> import 'importlib.machinery' # <<< 49915 
1727204295.55741: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a855626f0> <<< 49915 1727204295.55755: stdout chunk (state=3): >>>import 'importlib.util' # <<< 49915 1727204295.55760: stdout chunk (state=3): >>>import 'runpy' # <<< 49915 1727204295.55837: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 49915 1727204295.55878: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 49915 1727204295.55893: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8557c8c0> <<< 49915 1727204295.55929: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204295.55932: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a8557e000> <<< 49915 1727204295.55962: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 49915 1727204295.55969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 49915 1727204295.56010: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 49915 1727204295.56088: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8557ee40> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a8557f4a0> <<< 49915 1727204295.56119: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8557e390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 49915 1727204295.56181: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a8557ff20> <<< 49915 1727204295.56198: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8557f650> <<< 49915 1727204295.56265: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a85560da0> # 
/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 49915 1727204295.56290: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 49915 1727204295.56346: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 49915 1727204295.56378: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204295.56392: stdout chunk (state=3): >>>import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a8527fd70> <<< 49915 1727204295.56443: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 49915 1727204295.56452: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a852a8890> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a852a85f0> <<< 49915 1727204295.56479: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204295.56485: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a852a87d0> <<< 49915 1727204295.56548: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 49915 1727204295.56616: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204295.56844: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a852a9160> <<< 49915 1727204295.56967: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a852a9a60> <<< 49915 1727204295.57030: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a852a8a40> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8527df10> <<< 49915 1727204295.57051: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 49915 1727204295.57058: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 49915 1727204295.57089: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 49915 1727204295.57141: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a852aae70> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a852a9bb0> <<< 49915 1727204295.57157: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a85562de0> <<< 49915 1727204295.57189: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 49915 1727204295.57281: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204295.57350: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 49915 1727204295.57373: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a852d31d0> <<< 49915 1727204295.57434: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 49915 1727204295.57464: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204295.57479: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 49915 1727204295.57509: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 49915 1727204295.57561: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a852fb590> <<< 49915 1727204295.57594: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 49915 1727204295.57636: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 49915 1727204295.57720: stdout chunk (state=3): >>>import 'ntpath' # <<< 49915 1727204295.57746: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a853582f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 49915 1727204295.57796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 49915 1727204295.57861: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 49915 1727204295.58103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8535aa50> <<< 49915 
1727204295.58286: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a85358410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a853213a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851613d0> <<< 49915 1727204295.58350: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a852fa390> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a852abda0> <<< 49915 1727204295.58372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3a852fa6f0> <<< 49915 1727204295.58729: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_7r0ecd3h/ansible_setup_payload.zip' # zipimport: zlib available <<< 49915 1727204295.58850: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.58853: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 49915 1727204295.58881: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 49915 1727204295.58910: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 49915 1727204295.58999: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 49915 1727204295.59019: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851cb0b0> <<< 49915 1727204295.59040: stdout chunk (state=3): >>>import '_typing' # <<< 49915 1727204295.59209: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851a9fa0> <<< 49915 1727204295.59223: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851a9100> # zipimport: zlib available <<< 49915 1727204295.59282: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 49915 1727204295.59312: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 49915 1727204295.61546: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.62936: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851c8f50> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc 
matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 49915 1727204295.62982: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 49915 1727204295.62997: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a851feab0> <<< 49915 1727204295.63020: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851fe840> <<< 49915 1727204295.63109: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851fe150> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 49915 1727204295.63121: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851fe5a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851cbad0> import 'atexit' # <<< 49915 1727204295.63191: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a851ff830> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a851ffa70> <<< 49915 1727204295.63211: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 49915 1727204295.63684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851fffb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b2dd30> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b2f950> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b34350> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches 
/usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b354f0> <<< 49915 1727204295.63719: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 49915 1727204295.63764: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b37fb0> <<< 49915 1727204295.63845: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b380e0> <<< 49915 1727204295.63848: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b36270> <<< 49915 1727204295.63909: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 49915 1727204295.63928: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 49915 1727204295.63950: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 49915 1727204295.64225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b3bef0> import '_tokenize' # <<< 49915 1727204295.64269: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b3a9c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b3a720> <<< 49915 1727204295.64293: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc'<<< 49915 1727204295.64345: stdout chunk (state=3): >>> <<< 49915 1727204295.64414: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b3ac90><<< 49915 1727204295.64417: stdout chunk (state=3): >>> <<< 49915 1727204295.64459: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b36780><<< 49915 1727204295.64469: stdout chunk (state=3): >>> <<< 49915 1727204295.64502: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204295.64525: stdout chunk (state=3): 
>>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204295.64531: stdout chunk (state=3): >>> import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b806b0><<< 49915 1727204295.64537: stdout chunk (state=3): >>> <<< 49915 1727204295.64674: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b80320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204295.64682: stdout chunk (state=3): >>> <<< 49915 1727204295.64684: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204295.64702: stdout chunk (state=3): >>> import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b81d90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b81b50><<< 49915 1727204295.64726: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 49915 1727204295.64768: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'<<< 49915 1727204295.64835: stdout chunk (state=3): >>> # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204295.64855: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204295.64870: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b842f0> <<< 49915 1727204295.64897: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b82480> <<< 49915 1727204295.64987: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204295.65023: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 49915 1727204295.65046: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 49915 1727204295.65069: stdout chunk (state=3): >>>import '_string' # <<< 49915 1727204295.65326: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b87ad0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b844a0> <<< 49915 1727204295.65413: stdout chunk 
(state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204295.65438: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204295.65443: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b888f0> <<< 49915 1727204295.65497: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204295.65505: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204295.65520: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b85010> <<< 49915 1727204295.65593: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204295.65597: stdout chunk (state=3): >>> # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204295.65600: stdout chunk (state=3): >>> <<< 49915 1727204295.65616: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b88c50> <<< 49915 1727204295.65640: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b804d0><<< 49915 1727204295.65643: stdout chunk (state=3): >>> <<< 49915 1727204295.65682: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 49915 1727204295.65691: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 49915 1727204295.65699: stdout chunk (state=3): >>> <<< 49915 1727204295.65724: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py<<< 49915 1727204295.65729: stdout chunk (state=3): >>> <<< 49915 1727204295.65790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 49915 1727204295.65795: stdout chunk (state=3): >>> <<< 49915 1727204295.65837: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204295.65848: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204295.66359: stdout chunk (state=3): >>> import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84a14380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84a15640> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b8ab40> # extension 
module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b8bef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b8a780> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 49915 1727204295.66362: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.66504: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.66639: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.66666: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.66682: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 49915 1727204295.66708: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204295.66745: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.66771: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 49915 1727204295.66961: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.67145: stdout chunk (state=3): >>> <<< 49915 1727204295.67168: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.68047: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.68931: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 49915 1727204295.68994: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 49915 1727204295.68998: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 49915 1727204295.69026: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 49915 1727204295.69067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204295.69133: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204295.69176: stdout chunk (state=3): >>> # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84a19820> <<< 49915 1727204295.69330: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 49915 1727204295.69392: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84a1a5d0> <<< 49915 1727204295.69538: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b8a9c0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available<<< 49915 1727204295.69746: stdout chunk (state=3): >>> <<< 49915 1727204295.69767: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 
1727204295.70007: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py<<< 49915 1727204295.70037: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 49915 1727204295.70089: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84a1a3f0> <<< 49915 1727204295.70186: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.70819: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.70855: stdout chunk (state=3): >>> <<< 49915 1727204295.71556: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.71579: stdout chunk (state=3): >>> <<< 49915 1727204295.71674: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.71863: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 49915 1727204295.71898: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # <<< 49915 1727204295.71912: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.72342: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 49915 1727204295.72632: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.73009: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 49915 1727204295.73103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 49915 1727204295.73134: stdout chunk (state=3): >>>import '_ast' # <<< 49915 1727204295.73237: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84a1b830> <<< 49915 1727204295.73270: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.73391: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.73505: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 49915 1727204295.73536: stdout chunk (state=3): >>> import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 49915 1727204295.73546: stdout chunk (state=3): >>> import 'ansible.module_utils.common.arg_spec' # <<< 49915 1727204295.73633: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 49915 1727204295.73649: stdout chunk (state=3): >>> <<< 49915 1727204295.73690: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 49915 1727204295.73709: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204295.73784: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.73796: stdout chunk (state=3): >>> <<< 49915 1727204295.73932: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49915 1727204295.74037: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py<<< 49915 1727204295.74046: stdout chunk (state=3): >>> <<< 49915 1727204295.74107: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'<<< 49915 1727204295.74117: stdout chunk (state=3): >>> <<< 49915 1727204295.74292: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84a26240> <<< 49915 1727204295.74322: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84a21a60> <<< 49915 1727204295.74395: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 49915 1727204295.74541: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.74585: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.74684: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 49915 1727204295.74845: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 49915 1727204295.75000: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc'<<< 49915 1727204295.75003: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py<<< 49915 1727204295.75005: stdout chunk (state=3): >>> <<< 49915 1727204295.75007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc'<<< 49915 1727204295.75314: stdout chunk (state=3): >>> import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b0eab0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a85226780> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84a260c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b89b80> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 49915 1727204295.75371: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 49915 1727204295.75378: stdout chunk (state=3): >>> <<< 49915 1727204295.75383: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 49915 1727204295.75397: stdout chunk (state=3): >>> import 'ansible.module_utils.common.sys_info' # <<< 49915 1727204295.75478: stdout chunk (state=3): >>> import 'ansible.module_utils.basic' # <<< 49915 1727204295.75506: stdout chunk (state=3): >>> # zipimport: zlib available<<< 49915 1727204295.75531: stdout chunk (state=3): >>> # zipimport: zlib available<<< 49915 1727204295.75551: stdout chunk (state=3): >>> import 'ansible.modules' # <<< 49915 1727204295.75573: stdout chunk 
(state=3): >>># zipimport: zlib available<<< 49915 1727204295.75702: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204295.75766: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.75797: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204295.75830: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.75897: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204295.75981: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.76021: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.76043: stdout chunk (state=3): >>> <<< 49915 1727204295.76095: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 49915 1727204295.76117: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.76329: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 49915 1727204295.76356: stdout chunk (state=3): >>> <<< 49915 1727204295.76383: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.76461: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 49915 1727204295.76465: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.76777: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.77013: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.77074: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.77148: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py<<< 49915 1727204295.77167: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204295.77223: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 49915 1727204295.77245: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py<<< 49915 1727204295.77292: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 49915 1727204295.77328: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84ab6630> <<< 49915 1727204295.77381: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 49915 1727204295.77410: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py<<< 49915 1727204295.77438: stdout chunk (state=3): >>> <<< 49915 1727204295.77487: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 49915 1727204295.77536: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from 
'/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 49915 1727204295.77594: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a846b4170> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204295.77622: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204295.77647: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a846b44a0><<< 49915 1727204295.77710: stdout chunk (state=3): >>> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84aa3290><<< 49915 1727204295.77723: stdout chunk (state=3): >>> <<< 49915 1727204295.77745: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84ab71a0> <<< 49915 1727204295.77796: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84ab4cb0><<< 49915 1727204295.77817: stdout chunk (state=3): >>> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84ab48f0> <<< 49915 1727204295.77853: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py<<< 49915 1727204295.77865: stdout chunk (state=3): >>> <<< 49915 1727204295.77919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc'<<< 49915 1727204295.77951: stdout chunk (state=3): >>> <<< 49915 1727204295.77975: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py<<< 49915 1727204295.77993: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 49915 1727204295.78030: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc'<<< 49915 1727204295.78056: stdout chunk (state=3): >>> <<< 49915 1727204295.78082: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204295.78094: stdout chunk (state=3): >>> # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a846b73b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a846b6c60><<< 49915 1727204295.78134: stdout chunk (state=3): >>> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204295.78156: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204295.78179: stdout chunk (state=3): >>> import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a846b6e40> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a846b6090> <<< 49915 1727204295.78203: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py<<< 49915 1727204295.78374: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 49915 1727204295.78400: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a846b7500> <<< 49915 1727204295.78438: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 49915 1727204295.78490: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc'<<< 49915 1727204295.78499: stdout chunk (state=3): >>> <<< 49915 1727204295.78535: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204295.78563: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84716000> <<< 49915 1727204295.78612: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a846b7f50><<< 49915 1727204295.78622: stdout chunk (state=3): >>> <<< 49915 1727204295.78670: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84ab49b0> <<< 49915 1727204295.78673: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 49915 1727204295.78699: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.collector' # <<< 49915 1727204295.78777: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.79137: stdout chunk (state=3): >>> # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 49915 1727204295.79140: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available <<< 49915 1727204295.79157: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 49915 1727204295.79181: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.79203: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.79234: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system' # # zipimport: zlib available<<< 49915 1727204295.79253: stdout chunk (state=3): >>> <<< 49915 1727204295.79281: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.79323: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.apparmor' # <<< 49915 1727204295.79421: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204295.79446: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.79491: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.caps' # <<< 49915 1727204295.79524: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204295.79584: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.79649: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.chroot' # <<< 49915 1727204295.79697: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.79866: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49915 1727204295.79937: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.79954: stdout chunk (state=3): >>> <<< 49915 1727204295.80020: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 49915 1727204295.80059: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 49915 1727204295.80091: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.80837: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.81541: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 49915 1727204295.81568: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204295.81714: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49915 1727204295.81767: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.81832: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 49915 1727204295.81838: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 49915 1727204295.81860: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204295.81902: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.81939: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 49915 1727204295.82152: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available <<< 49915 1727204295.82182: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 49915 1727204295.82200: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.82229: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.82278: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 49915 1727204295.82281: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.82387: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.82507: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 49915 1727204295.82525: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 49915 1727204295.82566: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84717c50> <<< 49915 1727204295.82607: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py<<< 49915 1727204295.82624: stdout chunk (state=3): >>> <<< 49915 1727204295.82686: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 49915 1727204295.82891: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84716e40> import 'ansible.module_utils.facts.system.local' # <<< 49915 1727204295.82956: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.83029: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.83319: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.lsb' # <<< 49915 1727204295.83334: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49915 1727204295.83374: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 49915 1727204295.83404: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204295.83510: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.83611: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 49915 1727204295.83768: stdout chunk (state=3): >>> # zipimport: zlib available # zipimport: zlib available <<< 49915 1727204295.83793: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 49915 1727204295.83855: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc'<<< 49915 1727204295.83878: stdout chunk (state=3): >>> <<< 49915 1727204295.83946: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204295.83974: stdout chunk (state=3): >>> <<< 49915 1727204295.84045: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204295.84074: stdout chunk (state=3): >>> import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84752480><<< 49915 1727204295.84238: stdout chunk (state=3): >>> <<< 49915 1727204295.84408: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84743260> import 'ansible.module_utils.facts.system.python' # <<< 49915 1727204295.84514: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.84544: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.84581: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 49915 1727204295.84604: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.84745: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.85190: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49915 1727204295.85291: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 49915 1727204295.85298: stdout chunk (state=3): >>> <<< 49915 1727204295.85303: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 49915 1727204295.85318: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.85391: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.85394: stdout chunk (state=3): >>> <<< 49915 1727204295.85444: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 49915 1727204295.85448: stdout chunk (state=3): >>> <<< 49915 1727204295.85493: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.85702: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a8476a150><<< 49915 1727204295.85706: stdout chunk (state=3): >>> <<< 49915 1727204295.85727: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84769e80> import 'ansible.module_utils.facts.system.user' # <<< 49915 1727204295.85738: stdout chunk (state=3): >>> # zipimport: zlib available<<< 49915 1727204295.85827: stdout chunk (state=3): >>> # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 49915 1727204295.85878: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.85978: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 49915 1727204295.86197: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.86447: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 49915 1727204295.86451: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204295.86636: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.86760: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204295.86824: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.86945: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 49915 1727204295.86971: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 49915 1727204295.87000: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.87302: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.87436: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 49915 1727204295.87459: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 49915 1727204295.87543: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.87667: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.87867: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 49915 1727204295.88092: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204295.88121: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49915 1727204295.88832: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.88925: stdout chunk (state=3): >>> <<< 49915 1727204295.89718: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available<<< 49915 1727204295.89744: stdout chunk (state=3): >>> <<< 49915 1727204295.89818: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204295.89897: stdout chunk (state=3): >>> <<< 49915 1727204295.90019: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 49915 1727204295.90147: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.90331: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 49915 1727204295.90349: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.90665: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 
1727204295.90793: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 49915 1727204295.90824: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 49915 1727204295.90905: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49915 1727204295.91001: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 49915 1727204295.91020: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.91133: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.91332: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.91669: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.91940: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 49915 1727204295.91962: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 49915 1727204295.92056: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 49915 1727204295.92117: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.92157: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 49915 1727204295.92190: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.92241: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.92331: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 49915 1727204295.92435: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 49915 1727204295.92519: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.92619: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 49915 1727204295.92735: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # <<< 49915 1727204295.92753: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.93200: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.93661: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 49915 1727204295.93727: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.93857: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 49915 1727204295.93952: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.93956: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 49915 1727204295.94089: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 49915 1727204295.94172: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 49915 1727204295.94243: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.94341: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 49915 1727204295.94383: stdout chunk (state=3): >>># zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 49915 1727204295.94549: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 49915 1727204295.94553: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.94556: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.94635: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.94752: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.94826: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.94942: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 49915 1727204295.94954: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 49915 1727204295.95008: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.95096: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 49915 1727204295.95692: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.95724: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 49915 1727204295.95772: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.95851: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 49915 1727204295.95894: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.95934: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.96000: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 49915 1727204295.96004: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.96143: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.96269: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 49915 1727204295.96273: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.96382: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.96539: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 49915 1727204295.96625: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204295.96908: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 49915 1727204295.96958: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 49915 1727204295.96969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f3a84566c00> <<< 49915 1727204295.96995: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a845640b0> <<< 49915 1727204295.97087: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8455e750> <<< 49915 1727204295.98613: stdout chunk (state=3): >>> <<< 49915 1727204295.98659: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "58", "s<<< 49915 1727204295.98694: stdout chunk (state=3): >>>econd": "15", "epoch": "1727204295", "epoch_int": "1727204295", "date": "2024-09-24", "time": "14:58:15", "iso8601_micro": "2024-09-24T18:58:15.973246Z", "iso8601": "2024-09-24T18:58:15Z", "iso8601_basic": "20240924T145815973246", "iso8601_basic_short": "20240924T145815", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", 
"10.2.32.1"]}, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 49915 1727204295.99447: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 49915 1727204295.99499: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc<<< 49915 1727204295.99503: stdout chunk (state=3): >>> # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout<<< 49915 1727204295.99538: stdout chunk (state=3): >>> # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs<<< 49915 1727204295.99565: stdout chunk (state=3): >>> # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat<<< 49915 1727204295.99593: stdout chunk (state=3): >>> # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] 
removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site<<< 49915 1727204295.99611: stdout chunk (state=3): >>> # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix<<< 49915 1727204295.99645: stdout chunk (state=3): >>> # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings <<< 49915 1727204295.99661: stdout chunk (state=3): >>># cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib<<< 49915 1727204295.99696: stdout chunk (state=3): >>> # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile<<< 49915 1727204295.99715: stdout chunk (state=3): >>> # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible<<< 49915 1727204295.99769: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner<<< 49915 1727204295.99813: stdout chunk (state=3): >>> # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess 
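The {"ansible_facts": ...} JSON printed just before this interpreter cleanup is the setup module's return value that the controller extracts from the stdout stream above. A minimal sketch, assuming that payload has been saved to a local file named module_result.json (a hypothetical path used only for illustration, not something produced by this run), of reading a few of the fact keys that appear in the logged result:

    import json

    # Hypothetical capture of the setup module's JSON result shown above.
    with open("module_result.json") as fh:
        result = json.load(fh)

    facts = result["ansible_facts"]
    # These keys all appear in the logged payload for managed-node2.
    print(facts["ansible_distribution"])                # "CentOS"
    print(facts["ansible_distribution_major_version"])  # "10"
    print(facts["ansible_pkg_mgr"])                     # "dnf"
    print(facts["ansible_service_mgr"])                 # "systemd"
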
# cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize<<< 49915 1727204295.99896: stdout chunk (state=3): >>> # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket<<< 49915 1727204295.99900: stdout chunk (state=3): >>> # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors<<< 49915 1727204295.99985: stdout chunk (state=3): >>> # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse<<< 49915 1727204296.00014: stdout chunk (state=3): >>> # cleanup[2] removing distro.distro # cleanup[2] removing 
distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq<<< 49915 1727204296.00199: stdout chunk (state=3): >>> # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing 
ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy 
ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 49915 1727204296.00923: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob<<< 49915 1727204296.00971: stdout chunk (state=3): >>> # destroy ipaddress # destroy ntpath <<< 49915 1727204296.00974: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp<<< 49915 1727204296.01059: stdout chunk (state=3): >>> # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog<<< 49915 1727204296.01063: stdout chunk (state=3): >>> # destroy uuid # destroy selinux <<< 49915 1727204296.01095: stdout chunk (state=3): >>># destroy shutil # destroy distro # destroy distro.distro <<< 49915 1727204296.01191: stdout chunk (state=3): >>># destroy argparse # destroy logging # destroy 
ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection<<< 49915 1727204296.01242: stdout chunk (state=3): >>> # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq<<< 49915 1727204296.01353: stdout chunk (state=3): >>> # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing<<< 49915 1727204296.01378: stdout chunk (state=3): >>> # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess <<< 49915 1727204296.01427: stdout chunk (state=3): >>># destroy base64 # destroy _ssl <<< 49915 1727204296.01444: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios<<< 49915 1727204296.01455: stdout chunk (state=3): >>> # destroy errno<<< 49915 1727204296.01502: stdout chunk (state=3): >>> # destroy json # destroy socket <<< 49915 1727204296.01549: stdout chunk (state=3): >>># destroy struct <<< 49915 1727204296.01611: stdout chunk (state=3): >>># destroy glob # destroy fnmatch<<< 49915 1727204296.01618: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna <<< 49915 1727204296.01720: stdout chunk (state=3): >>># destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian<<< 49915 1727204296.01788: stdout chunk (state=3): >>> # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal<<< 49915 1727204296.01801: stdout chunk (state=3): >>> # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect<<< 49915 1727204296.01855: stdout chunk (state=3): >>> # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re<<< 49915 1727204296.01899: stdout chunk (state=3): >>> # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc<<< 49915 1727204296.01998: stdout chunk (state=3): >>> # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # 
cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io<<< 49915 1727204296.02064: stdout chunk (state=3): >>> # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon <<< 49915 1727204296.02161: stdout chunk (state=3): >>># destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 49915 1727204296.02465: stdout chunk (state=3): >>># destroy sys.monitoring <<< 49915 1727204296.02468: stdout chunk (state=3): >>># destroy _socket <<< 49915 1727204296.02470: stdout chunk (state=3): >>># destroy _collections <<< 49915 1727204296.02474: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath <<< 49915 1727204296.02478: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg <<< 49915 1727204296.02538: stdout chunk (state=3): >>># destroy contextlib # destroy _typing # destroy _tokenize <<< 49915 1727204296.02638: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator<<< 49915 1727204296.02682: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path <<< 49915 1727204296.02702: stdout chunk (state=3): >>># clear sys.modules # destroy _frozen_importlib<<< 49915 1727204296.02739: stdout chunk (state=3): >>> <<< 49915 1727204296.02808: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases<<< 49915 1727204296.02869: stdout chunk (state=3): >>> # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 <<< 49915 1727204296.02922: stdout chunk (state=3): >>># destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time<<< 49915 1727204296.02964: stdout chunk (state=3): >>> # destroy _random # destroy _weakref <<< 49915 1727204296.03015: stdout chunk (state=3): >>># destroy _hashlib<<< 49915 1727204296.03021: stdout chunk (state=3): >>> # destroy _operator<<< 49915 1727204296.03058: stdout chunk (state=3): >>> # destroy _sre # destroy _string # destroy re # destroy itertools <<< 49915 1727204296.03085: stdout chunk (state=3): >>># destroy _abc # destroy posix<<< 49915 1727204296.03095: stdout chunk (state=3): >>> # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks<<< 49915 1727204296.03346: stdout chunk (state=3): >>> <<< 
49915 1727204296.03760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 49915 1727204296.03763: stdout chunk (state=3): >>><<< 49915 1727204296.03765: stderr chunk (state=3): >>><<< 49915 1727204296.03841: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a856e84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a856b7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a856eaa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a85499130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8549a060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a854d7f80> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a854ec110> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8550f9b0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8550ff80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a854efc50> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a854ed370> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a854d5130> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8552f8f0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8552e510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a854ee210> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8552cda0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a855609b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a854d43b0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a85560e60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a85560d10> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a85561100> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a854d2ed0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a855617f0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a855614c0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a855626f0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8557c8c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a8557e000> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f3a8557ee40> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a8557f4a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8557e390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a8557ff20> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8557f650> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a85560da0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a8527fd70> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a852a8890> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a852a85f0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a852a87d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a852a9160> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a852a9a60> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3a852a8a40> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8527df10> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a852aae70> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a852a9bb0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a85562de0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a852d31d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a852fb590> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a853582f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8535aa50> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a85358410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a853213a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851613d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a852fa390> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a852abda0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f3a852fa6f0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_7r0ecd3h/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851cb0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851a9fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851a9100> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851c8f50> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a851feab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851fe840> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851fe150> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851fe5a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851cbad0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a851ff830> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a851ffa70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a851fffb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b2dd30> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b2f950> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b34350> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b354f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b37fb0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b380e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b36270> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b3bef0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b3a9c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b3a720> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b3ac90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b36780> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b806b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b80320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b81d90> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b81b50> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b842f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b82480> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b87ad0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b844a0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b888f0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b85010> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b88c50> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b804d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84a14380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84a15640> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b8ab40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84b8bef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b8a780> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84a19820> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84a1a5d0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b8a9c0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84a1a3f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84a1b830> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84a26240> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84a21a60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b0eab0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a85226780> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84a260c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84b89b80> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84ab6630> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a846b4170> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a846b44a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84aa3290> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84ab71a0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84ab4cb0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84ab48f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a846b73b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a846b6c60> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a846b6e40> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a846b6090> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a846b7500> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84716000> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a846b7f50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84ab49b0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84717c50> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84716e40> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84752480> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84743260> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a8476a150> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a84769e80> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3a84566c00> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a845640b0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3a8455e750> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "58", "second": "15", "epoch": "1727204295", "epoch_int": "1727204295", "date": "2024-09-24", "time": "14:58:15", "iso8601_micro": "2024-09-24T18:58:15.973246Z", "iso8601": "2024-09-24T18:58:15Z", "iso8601_basic": "20240924T145815973246", "iso8601_basic_short": "20240924T145815", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", 
"XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing 
_bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing 
ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy 
ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing 
ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # 
destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # 
cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
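The ansible_facts payload above is returned by the setup module as a single JSON document, but because PYTHONVERBOSE=1 is set in the remote environment (visible in ansible_env in the facts), the interpreter emits the import and shutdown traces seen throughout this capture, and they end up surrounding the JSON in the module's recorded output. Ansible still recovers the result (the task finishes ok) and only prints the "junk after the JSON data" warning that follows. Below is a minimal sketch of the same recovery idea using only the standard library; it is illustrative, not Ansible's actual filtering code, and the function name and sample text are made up for the example.

import json

def extract_module_result(raw_output: str):
    # Find the first byte of the JSON document, then let the decoder
    # consume exactly one object; everything after that index is "junk".
    start = raw_output.index("{")
    result, end = json.JSONDecoder().raw_decode(raw_output, start)
    return result, raw_output[end:]

noisy = (
    "# zipimport: zlib available\n"
    '{"ansible_facts": {"ansible_system": "Linux", "ansible_pkg_mgr": "dnf"}}\n'
    "# clear sys.path_importer_cache # clear sys.path_hooks\n"
)
facts, junk = extract_module_result(noisy)
print(facts["ansible_facts"]["ansible_system"])  # -> Linux
print(bool(junk.strip()))                        # -> True: junk after the JSON data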
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. [WARNING]: Module invocation had junk after the JSON data: [... same interpreter shutdown/cleanup trace as shown above ...] 49915 1727204296.05307: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204295.3422487-50116-164785714489511/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204296.05311: _low_level_execute_command(): starting 49915 1727204296.05313: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204295.3422487-50116-164785714489511/ > /dev/null 2>&1 && sleep 0' 49915 1727204296.05709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2:
resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204296.05724: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204296.05758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204296.05865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204296.08426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204296.08430: stderr chunk (state=3): >>><<< 49915 1727204296.08433: stdout chunk (state=3): >>><<< 49915 1727204296.08691: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204296.08695: handler run complete 49915 1727204296.08698: variable 'ansible_facts' from source: unknown 49915 1727204296.08700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204296.08702: variable 'ansible_facts' from source: unknown 49915 1727204296.08721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204296.08771: attempt loop complete, returning result 49915 1727204296.08775: _execute() done 49915 1727204296.08780: dumping result to json 49915 1727204296.08790: done dumping result, returning 49915 1727204296.08798: done running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [028d2410-947f-dcd7-b5af-0000000000c0] 49915 1727204296.08803: sending task result for task 028d2410-947f-dcd7-b5af-0000000000c0 ok: [managed-node2] 49915 1727204296.09062: no more pending results, returning what we have 49915 1727204296.09064: results queue empty 49915 1727204296.09065: checking for any_errors_fatal 49915 1727204296.09066: done checking for any_errors_fatal 49915 1727204296.09067: checking for max_fail_percentage 49915 
1727204296.09068: done checking for max_fail_percentage 49915 1727204296.09069: checking to see if all hosts have failed and the running result is not ok 49915 1727204296.09070: done checking to see if all hosts have failed 49915 1727204296.09070: getting the remaining hosts for this loop 49915 1727204296.09071: done getting the remaining hosts for this loop 49915 1727204296.09075: getting the next task for host managed-node2 49915 1727204296.09084: done getting next task for host managed-node2 49915 1727204296.09086: ^ task is: TASK: Check if system is ostree 49915 1727204296.09088: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204296.09091: getting variables 49915 1727204296.09093: in VariableManager get_vars() 49915 1727204296.09118: Calling all_inventory to load vars for managed-node2 49915 1727204296.09120: Calling groups_inventory to load vars for managed-node2 49915 1727204296.09123: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204296.09206: done sending task result for task 028d2410-947f-dcd7-b5af-0000000000c0 49915 1727204296.09209: WORKER PROCESS EXITING 49915 1727204296.09219: Calling all_plugins_play to load vars for managed-node2 49915 1727204296.09222: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204296.09226: Calling groups_plugins_play to load vars for managed-node2 49915 1727204296.09425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204296.09706: done with get_vars() 49915 1727204296.09721: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:58:16 -0400 (0:00:00.856) 0:00:02.804 ***** 49915 1727204296.09815: entering _queue_task() for managed-node2/stat 49915 1727204296.10303: worker is 1 (out of 1 available) 49915 1727204296.10315: exiting _queue_task() for managed-node2/stat 49915 1727204296.10325: done queuing things up, now waiting for results queue to drain 49915 1727204296.10330: waiting for pending results... 
49915 1727204296.10911: running TaskExecutor() for managed-node2/TASK: Check if system is ostree 49915 1727204296.11272: in run() - task 028d2410-947f-dcd7-b5af-0000000000c2 49915 1727204296.11280: variable 'ansible_search_path' from source: unknown 49915 1727204296.11283: variable 'ansible_search_path' from source: unknown 49915 1727204296.11287: calling self._execute() 49915 1727204296.11568: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204296.11695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204296.11700: variable 'omit' from source: magic vars 49915 1727204296.12515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204296.13554: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204296.13602: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204296.13713: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204296.14054: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204296.14272: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204296.14278: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204296.14281: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204296.14284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204296.14455: Evaluated conditional (not __network_is_ostree is defined): True 49915 1727204296.14613: variable 'omit' from source: magic vars 49915 1727204296.14719: variable 'omit' from source: magic vars 49915 1727204296.14761: variable 'omit' from source: magic vars 49915 1727204296.14889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204296.14952: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204296.15051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204296.15073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204296.15099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204296.15247: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204296.15263: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204296.15352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204296.15493: Set connection var ansible_connection to ssh 49915 1727204296.15582: Set connection var ansible_shell_type to sh 49915 1727204296.15585: Set connection 
var ansible_module_compression to ZIP_DEFLATED 49915 1727204296.15588: Set connection var ansible_shell_executable to /bin/sh 49915 1727204296.15590: Set connection var ansible_timeout to 10 49915 1727204296.15592: Set connection var ansible_pipelining to False 49915 1727204296.15594: variable 'ansible_shell_executable' from source: unknown 49915 1727204296.15596: variable 'ansible_connection' from source: unknown 49915 1727204296.15598: variable 'ansible_module_compression' from source: unknown 49915 1727204296.15600: variable 'ansible_shell_type' from source: unknown 49915 1727204296.15603: variable 'ansible_shell_executable' from source: unknown 49915 1727204296.15605: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204296.15607: variable 'ansible_pipelining' from source: unknown 49915 1727204296.15609: variable 'ansible_timeout' from source: unknown 49915 1727204296.15611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204296.16127: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49915 1727204296.16130: variable 'omit' from source: magic vars 49915 1727204296.16133: starting attempt loop 49915 1727204296.16135: running the handler 49915 1727204296.16147: _low_level_execute_command(): starting 49915 1727204296.16168: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204296.17487: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204296.17502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204296.17771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204296.17785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204296.17998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204296.18207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204296.20525: stdout chunk (state=3): >>>/root <<< 49915 1727204296.20731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204296.20806: stderr chunk (state=3): >>><<< 49915 1727204296.20809: stdout chunk (state=3): >>><<< 49915 1727204296.21091: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204296.21103: _low_level_execute_command(): starting 49915 1727204296.21108: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204296.2099302-50161-233815115425965 `" && echo ansible-tmp-1727204296.2099302-50161-233815115425965="` echo /root/.ansible/tmp/ansible-tmp-1727204296.2099302-50161-233815115425965 `" ) && sleep 0' 49915 1727204296.22194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204296.22210: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204296.22482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204296.22500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204296.22610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204296.25428: stdout chunk (state=3): >>>ansible-tmp-1727204296.2099302-50161-233815115425965=/root/.ansible/tmp/ansible-tmp-1727204296.2099302-50161-233815115425965 <<< 49915 1727204296.25695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204296.25715: stderr chunk (state=3): >>><<< 49915 1727204296.25728: stdout chunk (state=3): >>><<< 49915 1727204296.25804: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204296.2099302-50161-233815115425965=/root/.ansible/tmp/ansible-tmp-1727204296.2099302-50161-233815115425965 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204296.25867: variable 'ansible_module_compression' from source: unknown 49915 1727204296.26084: ANSIBALLZ: Using lock for stat 49915 1727204296.26087: ANSIBALLZ: Acquiring lock 49915 1727204296.26090: ANSIBALLZ: Lock acquired: 140698012047392 49915 1727204296.26092: ANSIBALLZ: Creating module 49915 1727204296.42501: ANSIBALLZ: Writing module into payload 49915 1727204296.42646: ANSIBALLZ: Writing module 49915 1727204296.42678: ANSIBALLZ: Renaming module 49915 1727204296.42691: ANSIBALLZ: Done creating module 49915 1727204296.42715: variable 'ansible_facts' from source: unknown 49915 1727204296.42805: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204296.2099302-50161-233815115425965/AnsiballZ_stat.py 49915 1727204296.43004: Sending initial data 49915 1727204296.43106: Sent initial data (153 bytes) 49915 1727204296.43786: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204296.43895: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204296.43919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204296.43936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204296.44050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204296.46415: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" 
revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204296.46485: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49915 1727204296.46625: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpvih0patr /root/.ansible/tmp/ansible-tmp-1727204296.2099302-50161-233815115425965/AnsiballZ_stat.py <<< 49915 1727204296.46629: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204296.2099302-50161-233815115425965/AnsiballZ_stat.py" <<< 49915 1727204296.46746: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpvih0patr" to remote "/root/.ansible/tmp/ansible-tmp-1727204296.2099302-50161-233815115425965/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204296.2099302-50161-233815115425965/AnsiballZ_stat.py" <<< 49915 1727204296.47447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204296.47469: stderr chunk (state=3): >>><<< 49915 1727204296.47472: stdout chunk (state=3): >>><<< 49915 1727204296.47491: done transferring module to remote 49915 1727204296.47503: _low_level_execute_command(): starting 49915 1727204296.47507: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204296.2099302-50161-233815115425965/ /root/.ansible/tmp/ansible-tmp-1727204296.2099302-50161-233815115425965/AnsiballZ_stat.py && sleep 0' 49915 1727204296.47929: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204296.47932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204296.47935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204296.47937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204296.47994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204296.48100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 49915 1727204296.50721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204296.50740: stderr chunk (state=3): >>><<< 49915 1727204296.50753: stdout chunk (state=3): >>><<< 49915 1727204296.50829: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204296.50832: _low_level_execute_command(): starting 49915 1727204296.50834: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204296.2099302-50161-233815115425965/AnsiballZ_stat.py && sleep 0' 49915 1727204296.51460: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204296.51463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204296.51466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204296.51470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204296.51473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204296.51570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204296.51623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204296.51839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204296.54960: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 49915 1727204296.55031: stdout chunk (state=3): >>>import '_thread' # <<< 49915 1727204296.55049: stdout chunk 
(state=3): >>>import '_warnings' # import '_weakref' # <<< 49915 1727204296.55132: stdout chunk (state=3): >>> import '_io' # <<< 49915 1727204296.55161: stdout chunk (state=3): >>>import 'marshal' # <<< 49915 1727204296.55206: stdout chunk (state=3): >>>import 'posix' # <<< 49915 1727204296.55263: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 49915 1727204296.55292: stdout chunk (state=3): >>> # installing zipimport hook <<< 49915 1727204296.55329: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook<<< 49915 1727204296.55423: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204296.55453: stdout chunk (state=3): >>>import '_codecs' # <<< 49915 1727204296.55490: stdout chunk (state=3): >>>import 'codecs' # <<< 49915 1727204296.55651: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14178184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14177e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141781aa50> import '_signal' # import '_abc' # import 'abc' # <<< 49915 1727204296.55721: stdout chunk (state=3): >>>import 'io' # import '_stat' # import 'stat' # <<< 49915 1727204296.55835: stdout chunk (state=3): >>>import '_collections_abc' # <<< 49915 1727204296.55874: stdout chunk (state=3): >>> import 'genericpath' # <<< 49915 1727204296.55888: stdout chunk (state=3): >>>import 'posixpath' # <<< 49915 1727204296.55893: stdout chunk (state=3): >>> <<< 49915 1727204296.55925: stdout chunk (state=3): >>>import 'os' # <<< 49915 1727204296.55956: stdout chunk (state=3): >>> import '_sitebuiltins' # <<< 49915 1727204296.55991: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages<<< 49915 1727204296.56003: stdout chunk (state=3): >>> Adding directory: '/usr/local/lib/python3.12/site-packages'<<< 49915 1727204296.56020: stdout chunk (state=3): >>> Adding directory: '/usr/lib64/python3.12/site-packages'<<< 49915 1727204296.56028: stdout chunk (state=3): >>> Adding directory: '/usr/lib/python3.12/site-packages'<<< 49915 1727204296.56049: stdout chunk (state=3): >>> Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 49915 1727204296.56094: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc'<<< 49915 1727204296.56134: stdout chunk (state=3): >>> <<< 49915 1727204296.56140: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14175e9130> <<< 49915 1727204296.56225: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py<<< 49915 1727204296.56265: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc'<<< 49915 1727204296.56298: stdout chunk (state=3): >>> import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14175ea060> <<< 49915 1727204296.56361: stdout chunk (state=3): >>>import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 49915 1727204296.56724: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py<<< 49915 1727204296.56759: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 49915 1727204296.56795: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py<<< 49915 1727204296.56808: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204296.56900: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc'<<< 49915 1727204296.56908: stdout chunk (state=3): >>> <<< 49915 1727204296.56932: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 49915 1727204296.56982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc'<<< 49915 1727204296.56985: stdout chunk (state=3): >>> <<< 49915 1727204296.57004: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417627f80> <<< 49915 1727204296.57064: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc'<<< 49915 1727204296.57072: stdout chunk (state=3): >>> <<< 49915 1727204296.57095: stdout chunk (state=3): >>>import '_operator' # <<< 49915 1727204296.57101: stdout chunk (state=3): >>> import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141763c110><<< 49915 1727204296.57152: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 49915 1727204296.57173: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc'<<< 49915 1727204296.57224: stdout chunk (state=3): >>> # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 49915 1727204296.57306: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204296.57324: stdout chunk (state=3): >>>import 'itertools' # <<< 49915 1727204296.57369: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 49915 1727204296.57373: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc'<<< 49915 1727204296.57421: stdout chunk (state=3): >>> import 'keyword' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f141765f9b0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 49915 1727204296.57458: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141765ff80><<< 49915 1727204296.57461: stdout chunk (state=3): >>> import '_collections' # <<< 49915 1727204296.57471: stdout chunk (state=3): >>> <<< 49915 1727204296.57541: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141763fc50><<< 49915 1727204296.57562: stdout chunk (state=3): >>> import '_functools' # <<< 49915 1727204296.57588: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141763d370><<< 49915 1727204296.57728: stdout chunk (state=3): >>> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417625130> <<< 49915 1727204296.57764: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 49915 1727204296.57804: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 49915 1727204296.57856: stdout chunk (state=3): >>>import '_sre' # <<< 49915 1727204296.57861: stdout chunk (state=3): >>> <<< 49915 1727204296.57882: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 49915 1727204296.57898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 49915 1727204296.57933: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py<<< 49915 1727204296.57936: stdout chunk (state=3): >>> <<< 49915 1727204296.57938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc'<<< 49915 1727204296.57954: stdout chunk (state=3): >>> <<< 49915 1727204296.57992: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141767f8f0><<< 49915 1727204296.58020: stdout chunk (state=3): >>> <<< 49915 1727204296.58025: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141767e510> <<< 49915 1727204296.58065: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 49915 1727204296.58089: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141763e210><<< 49915 1727204296.58170: stdout chunk (state=3): >>> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141767cda0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py<<< 49915 1727204296.58203: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 49915 1727204296.58221: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176b09b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f14176243b0> <<< 49915 1727204296.58249: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 49915 1727204296.58254: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc'<<< 49915 1727204296.58306: stdout chunk (state=3): >>> # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204296.58331: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.58334: stdout chunk (state=3): >>> import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14176b0e60><<< 49915 1727204296.58391: stdout chunk (state=3): >>> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176b0d10> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204296.58443: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204296.58448: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14176b1100><<< 49915 1727204296.58451: stdout chunk (state=3): >>> <<< 49915 1727204296.58472: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417622ed0><<< 49915 1727204296.58527: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py<<< 49915 1727204296.58530: stdout chunk (state=3): >>> <<< 49915 1727204296.58558: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204296.58589: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py<<< 49915 1727204296.58620: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 49915 1727204296.58654: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176b17f0><<< 49915 1727204296.58659: stdout chunk (state=3): >>> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176b14c0> <<< 49915 1727204296.58679: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 49915 1727204296.58713: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 49915 1727204296.58719: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 49915 1727204296.58751: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176b26f0><<< 49915 1727204296.58779: stdout chunk (state=3): >>> import 'importlib.util' # <<< 49915 1727204296.58787: stdout chunk (state=3): >>> <<< 49915 1727204296.58831: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py<<< 49915 1727204296.58853: stdout chunk (state=3): >>> <<< 49915 1727204296.58915: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 49915 1727204296.58967: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176cc8c0><<< 49915 1727204296.58971: stdout chunk (state=3): >>> import 'errno' # <<< 49915 1727204296.58974: stdout chunk (state=3): >>> <<< 49915 1727204296.59018: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.59021: stdout chunk (state=3): >>> import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14176ce000><<< 49915 1727204296.59059: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py<<< 49915 1727204296.59062: stdout chunk (state=3): >>> <<< 49915 1727204296.59071: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc'<<< 49915 1727204296.59102: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 49915 1727204296.59133: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc'<<< 49915 1727204296.59200: stdout chunk (state=3): >>> import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176cee40> <<< 49915 1727204296.59207: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.59227: stdout chunk (state=3): >>> # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14176cf4a0><<< 49915 1727204296.59230: stdout chunk (state=3): >>> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176ce390><<< 49915 1727204296.59265: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py<<< 49915 1727204296.59270: stdout chunk (state=3): >>> <<< 49915 1727204296.59288: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc'<<< 49915 1727204296.59337: stdout chunk (state=3): >>> # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.59365: stdout chunk (state=3): >>> # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.59376: stdout chunk (state=3): >>> import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14176cff20><<< 49915 1727204296.59388: stdout chunk (state=3): >>> <<< 49915 1727204296.59395: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176cf650><<< 49915 1727204296.59473: stdout chunk (state=3): >>> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176b0da0> <<< 49915 1727204296.59502: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 49915 1727204296.59550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 49915 1727204296.59554: stdout chunk (state=3): >>> <<< 49915 1727204296.59588: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 49915 1727204296.59624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 49915 1727204296.59669: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.59678: stdout chunk (state=3): >>> # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.59711: stdout chunk (state=3): >>> import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f141744bd70> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 49915 1727204296.59729: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc'<<< 49915 1727204296.59765: stdout chunk (state=3): >>> # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.59774: stdout chunk (state=3): >>> # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.59786: stdout chunk (state=3): >>> <<< 49915 1727204296.59794: stdout chunk (state=3): >>>import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14174748c0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417474620><<< 49915 1727204296.59825: stdout chunk (state=3): >>> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204296.59844: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204296.59891: stdout chunk (state=3): >>>import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14174748f0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 49915 1727204296.59913: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 49915 1727204296.60004: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204296.60359: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1417475220> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204296.60362: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.60543: stdout chunk (state=3): >>> import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1417475c10> import 
'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417474ad0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417449f10> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417477020> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417475d60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176b2de0> <<< 49915 1727204296.60563: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 49915 1727204296.60687: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204296.60711: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 49915 1727204296.60782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 49915 1727204296.60819: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141749f3b0><<< 49915 1727204296.60833: stdout chunk (state=3): >>> <<< 49915 1727204296.60908: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py<<< 49915 1727204296.60916: stdout chunk (state=3): >>> <<< 49915 1727204296.60943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204296.60989: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 49915 1727204296.61016: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 49915 1727204296.61078: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14174c3770><<< 49915 1727204296.61083: stdout chunk (state=3): >>> <<< 49915 1727204296.61110: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py<<< 49915 1727204296.61181: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 49915 1727204296.61278: stdout chunk (state=3): >>>import 'ntpath' # <<< 49915 1727204296.61342: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 49915 1727204296.61357: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417524590> <<< 49915 1727204296.61383: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py<<< 49915 1727204296.61427: stdout chunk (state=3): >>> # code object from 
'/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc'<<< 49915 1727204296.61431: stdout chunk (state=3): >>> <<< 49915 1727204296.61470: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py<<< 49915 1727204296.61476: stdout chunk (state=3): >>> <<< 49915 1727204296.61527: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc'<<< 49915 1727204296.61666: stdout chunk (state=3): >>> import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417526cf0> <<< 49915 1727204296.61791: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14175246b0> <<< 49915 1727204296.61850: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14174e9580> <<< 49915 1727204296.61901: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 49915 1727204296.61905: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 49915 1727204296.61923: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141732d610> <<< 49915 1727204296.61950: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14174c2570><<< 49915 1727204296.61963: stdout chunk (state=3): >>> <<< 49915 1727204296.61967: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417477f80> <<< 49915 1727204296.62133: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc'<<< 49915 1727204296.62136: stdout chunk (state=3): >>> <<< 49915 1727204296.62166: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f14174c28d0><<< 49915 1727204296.62171: stdout chunk (state=3): >>> <<< 49915 1727204296.62364: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_bk33vq_q/ansible_stat_payload.zip'<<< 49915 1727204296.62378: stdout chunk (state=3): >>> <<< 49915 1727204296.62383: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204296.62544: stdout chunk (state=3): >>> <<< 49915 1727204296.62611: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.62646: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py<<< 49915 1727204296.62654: stdout chunk (state=3): >>> <<< 49915 1727204296.62690: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc'<<< 49915 1727204296.62693: stdout chunk (state=3): >>> <<< 49915 1727204296.62757: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 49915 1727204296.62872: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc'<<< 49915 1727204296.62877: stdout chunk (state=3): >>> <<< 49915 1727204296.62920: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py<<< 49915 1727204296.62931: stdout chunk (state=3): >>> # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 49915 1727204296.62949: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141737f320> <<< 49915 1727204296.62968: stdout chunk (state=3): >>>import '_typing' # <<< 49915 1727204296.62974: stdout chunk (state=3): >>> <<< 49915 1727204296.63242: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417362210> <<< 49915 1727204296.63258: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417361370><<< 49915 1727204296.63265: stdout chunk (state=3): >>> <<< 49915 1727204296.63284: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.63324: stdout chunk (state=3): >>>import 'ansible' # <<< 49915 1727204296.63336: stdout chunk (state=3): >>> <<< 49915 1727204296.63366: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 49915 1727204296.63371: stdout chunk (state=3): >>> <<< 49915 1727204296.63399: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204296.63419: stdout chunk (state=3): >>> import 'ansible.module_utils' # <<< 49915 1727204296.63540: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.65642: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.67543: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 49915 1727204296.67558: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 49915 1727204296.67573: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141737d010> <<< 49915 1727204296.67615: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 49915 1727204296.67639: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc'<<< 49915 1727204296.67673: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py<<< 49915 1727204296.67679: stdout chunk (state=3): >>> <<< 49915 1727204296.67704: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc'<<< 49915 1727204296.67707: stdout chunk (state=3): >>> <<< 49915 1727204296.67753: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc'<<< 49915 1727204296.67827: stdout chunk (state=3): >>> # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204296.67830: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204296.67849: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14173aab70><<< 49915 1727204296.67889: stdout chunk (state=3): >>> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14173aa900> <<< 49915 1727204296.67943: stdout chunk (state=3): >>>import 'json.decoder' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f14173aa270><<< 49915 1727204296.67973: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 49915 1727204296.68009: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 49915 1727204296.68047: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14173aac60> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141737fd40><<< 49915 1727204296.68092: stdout chunk (state=3): >>> import 'atexit' # <<< 49915 1727204296.68123: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14173ab920><<< 49915 1727204296.68168: stdout chunk (state=3): >>> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14173abb60><<< 49915 1727204296.68214: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py<<< 49915 1727204296.68233: stdout chunk (state=3): >>> <<< 49915 1727204296.68262: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc'<<< 49915 1727204296.68289: stdout chunk (state=3): >>> import '_locale' # <<< 49915 1727204296.68359: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14173abf20><<< 49915 1727204296.68373: stdout chunk (state=3): >>> import 'pwd' # <<< 49915 1727204296.68432: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc'<<< 49915 1727204296.68438: stdout chunk (state=3): >>> <<< 49915 1727204296.68494: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d0ddc0> <<< 49915 1727204296.68540: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.68557: stdout chunk (state=3): >>> # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.68588: stdout chunk (state=3): >>> import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d0f9e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 49915 1727204296.68600: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc'<<< 49915 1727204296.68654: stdout chunk (state=3): >>> import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d143b0> <<< 49915 1727204296.68691: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 49915 1727204296.68724: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc'<<< 49915 1727204296.68729: stdout chunk (state=3): >>> <<< 49915 1727204296.68754: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d15550><<< 49915 1727204296.68759: stdout chunk (state=3): >>> <<< 49915 1727204296.68839: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc'<<< 49915 1727204296.68855: stdout chunk (state=3): >>> <<< 49915 1727204296.68874: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py<<< 49915 1727204296.68889: stdout chunk (state=3): >>> <<< 49915 1727204296.68895: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 49915 1727204296.69001: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d17f20><<< 49915 1727204296.69028: stdout chunk (state=3): >>> <<< 49915 1727204296.69107: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d1c380> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d16300><<< 49915 1727204296.69110: stdout chunk (state=3): >>> <<< 49915 1727204296.69167: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 49915 1727204296.69207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 49915 1727204296.69233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 49915 1727204296.69292: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 49915 1727204296.69296: stdout chunk (state=3): >>> <<< 49915 1727204296.69363: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 49915 1727204296.69382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d1ff50><<< 49915 1727204296.69392: stdout chunk (state=3): >>> import '_tokenize' # <<< 49915 1727204296.69487: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d1eae0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d1e840><<< 49915 1727204296.69524: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 49915 1727204296.69544: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 49915 1727204296.69692: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d1edb0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d167e0><<< 49915 1727204296.69697: stdout chunk (state=3): >>> <<< 49915 1727204296.69741: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.69744: stdout chunk (state=3): >>> <<< 49915 1727204296.69755: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204296.69793: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d64200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 49915 1727204296.69797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 49915 1727204296.69810: stdout chunk (state=3): >>> import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d64380><<< 49915 1727204296.69851: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 49915 1727204296.69890: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 49915 1727204296.69937: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 49915 1727204296.69994: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d65e20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d65be0><<< 49915 1727204296.70037: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 49915 1727204296.70214: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'<<< 49915 1727204296.70282: stdout chunk (state=3): >>> # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.70289: stdout chunk (state=3): >>> # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.70324: stdout chunk (state=3): >>> import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d68380> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d66510><<< 49915 1727204296.70327: stdout chunk (state=3): >>> <<< 49915 1727204296.70357: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py<<< 49915 1727204296.70417: stdout chunk 
(state=3): >>> # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204296.70445: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 49915 1727204296.70469: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 49915 1727204296.70492: stdout chunk (state=3): >>>import '_string' # <<< 49915 1727204296.70560: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d6ba40> <<< 49915 1727204296.70824: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d68500> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204296.70830: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204296.70857: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d6cb60> <<< 49915 1727204296.70889: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.70900: stdout chunk (state=3): >>> # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.70911: stdout chunk (state=3): >>> import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d6cd10><<< 49915 1727204296.70996: stdout chunk (state=3): >>> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204296.71002: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d6cc20><<< 49915 1727204296.71017: stdout chunk (state=3): >>> <<< 49915 1727204296.71037: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d644a0> <<< 49915 1727204296.71069: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py<<< 49915 1727204296.71092: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 49915 1727204296.71101: stdout chunk (state=3): >>> <<< 49915 1727204296.71170: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 49915 1727204296.71212: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.71219: stdout chunk (state=3): >>> <<< 49915 1727204296.71264: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416df8470><<< 49915 1727204296.71335: stdout chunk (state=3): >>> <<< 49915 1727204296.71511: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204296.71540: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204296.71549: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416df9850> <<< 49915 1727204296.71574: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d6ec00> <<< 49915 1727204296.71615: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 49915 1727204296.71625: stdout chunk (state=3): >>> # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d6ffb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d6e840><<< 49915 1727204296.71738: stdout chunk (state=3): >>> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 49915 1727204296.71827: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.71956: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.71977: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.71992: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 49915 1727204296.72013: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.72043: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.72046: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 49915 1727204296.72073: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.72254: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.72437: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.73334: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.74220: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 49915 1727204296.74443: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416dfda00> <<< 49915 1727204296.74467: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 49915 1727204296.74474: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 49915 1727204296.74505: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416dfe7b0> <<< 49915 1727204296.74519: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416df9670> <<< 49915 1727204296.74581: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 49915 1727204296.74600: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.74628: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.74646: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 49915 1727204296.74669: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.74893: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.75124: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 49915 1727204296.75144: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 49915 1727204296.75165: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416dfe810> <<< 49915 1727204296.75184: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.75928: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.76624: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.76732: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.76837: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 49915 1727204296.76861: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.76909: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.76961: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 49915 1727204296.76979: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.77085: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.77213: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 49915 1727204296.77240: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.77261: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.77284: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 49915 1727204296.77436: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 49915 1727204296.77765: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.78136: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 49915 1727204296.78230: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 49915 1727204296.78253: stdout chunk (state=3): >>>import '_ast' # <<< 49915 1727204296.78362: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416dff9e0> <<< 49915 1727204296.78382: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.78497: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.78604: stdout chunk (state=3): >>>import 
'ansible.module_utils.common.text.formatters' # <<< 49915 1727204296.78624: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 49915 1727204296.78635: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 49915 1727204296.78655: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 49915 1727204296.78687: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.78753: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.78804: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 49915 1727204296.78826: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.78889: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.79036: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 49915 1727204296.79121: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 49915 1727204296.79186: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 49915 1727204296.79296: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204296.79440: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 49915 1727204296.79446: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416c0a210> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416c07fe0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 49915 1727204296.79496: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.79587: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.79625: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.79687: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc'<<< 49915 1727204296.79692: stdout chunk (state=3): >>> <<< 49915 1727204296.79727: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py<<< 49915 1727204296.79774: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'<<< 49915 1727204296.79783: stdout chunk (state=3): >>> <<< 49915 1727204296.79809: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 49915 1727204296.79937: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 49915 1727204296.79978: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc'<<< 49915 
1727204296.80068: stdout chunk (state=3): >>> import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14173e6a50><<< 49915 1727204296.80076: stdout chunk (state=3): >>> <<< 49915 1727204296.80141: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14173f6720><<< 49915 1727204296.80148: stdout chunk (state=3): >>> <<< 49915 1727204296.80264: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416c0a060><<< 49915 1727204296.80268: stdout chunk (state=3): >>> <<< 49915 1727204296.80298: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d6ea80><<< 49915 1727204296.80301: stdout chunk (state=3): >>> # destroy ansible.module_utils.distro <<< 49915 1727204296.80305: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # <<< 49915 1727204296.80327: stdout chunk (state=3): >>># zipimport: zlib available <<< 49915 1727204296.80407: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # <<< 49915 1727204296.80425: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 49915 1727204296.80500: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 49915 1727204296.80529: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204296.80558: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 49915 1727204296.80588: stdout chunk (state=3): >>> # zipimport: zlib available <<< 49915 1727204296.80831: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204296.81040: stdout chunk (state=3): >>> <<< 49915 1727204296.81109: stdout chunk (state=3): >>># zipimport: zlib available<<< 49915 1727204296.81115: stdout chunk (state=3): >>> <<< 49915 1727204296.81253: stdout chunk (state=3): >>> <<< 49915 1727204296.81256: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 49915 1727204296.81302: stdout chunk (state=3): >>># destroy __main__ <<< 49915 1727204296.81745: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 49915 1727204296.81819: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._<<< 49915 1727204296.81859: stdout chunk (state=3): >>> # clear sys.path # clear sys.argv # clear sys.ps1 <<< 49915 1727204296.81866: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_exc <<< 49915 1727204296.81951: stdout chunk (state=3): >>># clear sys.last_type # clear sys.last_value # clear sys.last_traceback <<< 49915 1727204296.81957: stdout chunk (state=3): >>># clear sys.__interactivehook__<<< 49915 1727204296.81960: stdout chunk (state=3): >>> # clear sys.meta_path <<< 49915 1727204296.81973: stdout chunk (state=3): >>># restore sys.stdin # restore sys.stdout <<< 49915 1727204296.82008: stdout chunk (state=3): >>># restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp<<< 49915 1727204296.82051: stdout chunk (state=3): >>> # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # 
cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases<<< 49915 1727204296.82106: stdout chunk (state=3): >>> # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat<<< 49915 1727204296.82167: stdout chunk (state=3): >>> # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools<<< 49915 1727204296.82172: stdout chunk (state=3): >>> # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery<<< 49915 1727204296.82178: stdout chunk (state=3): >>> # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2<<< 49915 1727204296.82229: stdout chunk (state=3): >>> # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib<<< 49915 1727204296.82233: stdout chunk (state=3): >>> # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile<<< 49915 1727204296.82258: stdout chunk (state=3): >>> # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] 
removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder<<< 49915 1727204296.82572: stdout chunk (state=3): >>> # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing 
swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # des<<< 49915 1727204296.82580: stdout chunk (state=3): >>>troy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 49915 1727204296.82665: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 49915 1727204296.82672: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 49915 1727204296.82710: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma <<< 49915 1727204296.82715: stdout chunk (state=3): >>># destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch <<< 49915 1727204296.82737: stdout chunk (state=3): >>># destroy ipaddress <<< 49915 1727204296.82808: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib<<< 49915 1727204296.82841: stdout chunk (state=3): >>> # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal <<< 49915 1727204296.82895: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid <<< 49915 1727204296.82902: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array # destroy datetime <<< 49915 1727204296.83062: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping 
math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 49915 1727204296.83069: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath <<< 49915 1727204296.83100: stdout chunk (state=3): >>># cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 <<< 49915 1727204296.83114: stdout chunk (state=3): >>># cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io <<< 49915 1727204296.83131: stdout chunk (state=3): >>># cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 49915 1727204296.83145: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 49915 1727204296.83316: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 49915 1727204296.83332: stdout chunk (state=3): >>># destroy _collections <<< 49915 1727204296.83881: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 49915 1727204296.83884: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools <<< 49915 1727204296.83886: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 49915 1727204296.84093: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 49915 1727204296.84141: stderr chunk (state=3): >>>Shared connection to 10.31.13.254 closed. <<< 49915 1727204296.84213: stderr chunk (state=3): >>><<< 49915 1727204296.84225: stdout chunk (state=3): >>><<< 49915 1727204296.84414: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14178184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14177e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141781aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14175e9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14175ea060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417627f80> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141763c110> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141765f9b0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141765ff80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141763fc50> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141763d370> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417625130> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141767f8f0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141767e510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141763e210> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141767cda0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176b09b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176243b0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14176b0e60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176b0d10> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14176b1100> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417622ed0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176b17f0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176b14c0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176b26f0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176cc8c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14176ce000> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f14176cee40> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14176cf4a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176ce390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14176cff20> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176cf650> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176b0da0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f141744bd70> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14174748c0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417474620> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14174748f0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1417475220> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1417475c10> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1417474ad0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417449f10> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417477020> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417475d60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14176b2de0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141749f3b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14174c3770> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417524590> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417526cf0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14175246b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14174e9580> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141732d610> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14174c2570> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417477f80> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f14174c28d0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_bk33vq_q/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141737f320> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417362210> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1417361370> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141737d010> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14173aab70> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14173aa900> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14173aa270> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14173aac60> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f141737fd40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14173ab920> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f14173abb60> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14173abf20> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d0ddc0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d0f9e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d143b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d15550> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d17f20> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d1c380> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d16300> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d1ff50> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d1eae0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d1e840> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d1edb0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d167e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d64200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d64380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d65e20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d65be0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d68380> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d66510> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d6ba40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d68500> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d6cb60> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d6cd10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d6cc20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d644a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416df8470> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416df9850> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d6ec00> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416d6ffb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d6e840> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416dfda00> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416dfe7b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416df9670> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416dfe810> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416dff9e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1416c0a210> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416c07fe0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14173e6a50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f14173f6720> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416c0a060> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1416d6ea80> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # 
destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # 
destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
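The output above is the remote half of the "Check if system is ostree" task: the zipped AnsiballZ stat payload is unpacked and imported, the module prints its JSON result ({"changed": false, "stat": {"exists": false}} for /run/ostree-booted), the temporary interpreter tears itself down, and the multiplexed SSH session to 10.31.13.254 is released. A minimal sketch of the two tasks this trace corresponds to follows, reconstructed only from what the log itself shows (the module arguments in the "invocation" block, the __ostree_booted_stat registered variable, the 'not __network_is_ostree is defined' conditional, and the resulting "__network_is_ostree": false fact reported further down); it is not copied from the actual el_repo_setup.yml, and the exact set_fact expression is an assumption.

# Hedged sketch reconstructed from this log, not from the task file itself.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted  # path seen in the module_args of the JSON result above
  register: __ostree_booted_stat

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    # Expression inferred: stat.exists is false above and the resulting fact is false.
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined  # conditional shown evaluating to True below

The registered __ostree_booted_stat result is what the later "Set flag to indicate system is ostree" task reads; that task is a set_fact action, which is why the trace below shows its handler completing on the controller without another _low_level_execute_command round trip to the managed node.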
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] 
removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # 
cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 49915 1727204296.85010: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204296.2099302-50161-233815115425965/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204296.85013: _low_level_execute_command(): starting 49915 1727204296.85015: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1727204296.2099302-50161-233815115425965/ > /dev/null 2>&1 && sleep 0' 49915 1727204296.85154: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204296.85158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204296.85167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204296.85169: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204296.85171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204296.85215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204296.85219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204296.85303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204296.87956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204296.87972: stderr chunk (state=3): >>><<< 49915 1727204296.87978: stdout chunk (state=3): >>><<< 49915 1727204296.87995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204296.88181: handler run complete 49915 1727204296.88184: attempt loop complete, returning result 49915 1727204296.88187: _execute() done 49915 1727204296.88189: dumping result to json 49915 1727204296.88191: done dumping result, returning 49915 1727204296.88194: done running TaskExecutor() for managed-node2/TASK: Check if system is ostree [028d2410-947f-dcd7-b5af-0000000000c2] 49915 1727204296.88196: sending 
task result for task 028d2410-947f-dcd7-b5af-0000000000c2 ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 49915 1727204296.88315: no more pending results, returning what we have 49915 1727204296.88318: results queue empty 49915 1727204296.88319: checking for any_errors_fatal 49915 1727204296.88325: done checking for any_errors_fatal 49915 1727204296.88326: checking for max_fail_percentage 49915 1727204296.88327: done checking for max_fail_percentage 49915 1727204296.88328: checking to see if all hosts have failed and the running result is not ok 49915 1727204296.88329: done checking to see if all hosts have failed 49915 1727204296.88330: getting the remaining hosts for this loop 49915 1727204296.88331: done getting the remaining hosts for this loop 49915 1727204296.88335: getting the next task for host managed-node2 49915 1727204296.88340: done getting next task for host managed-node2 49915 1727204296.88342: ^ task is: TASK: Set flag to indicate system is ostree 49915 1727204296.88344: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204296.88348: getting variables 49915 1727204296.88350: in VariableManager get_vars() 49915 1727204296.88493: Calling all_inventory to load vars for managed-node2 49915 1727204296.88496: Calling groups_inventory to load vars for managed-node2 49915 1727204296.88499: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204296.88519: Calling all_plugins_play to load vars for managed-node2 49915 1727204296.88522: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204296.88526: Calling groups_plugins_play to load vars for managed-node2 49915 1727204296.88754: done sending task result for task 028d2410-947f-dcd7-b5af-0000000000c2 49915 1727204296.88757: WORKER PROCESS EXITING 49915 1727204296.88772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204296.88959: done with get_vars() 49915 1727204296.88969: done getting variables 49915 1727204296.89065: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:58:16 -0400 (0:00:00.792) 0:00:03.597 ***** 49915 1727204296.89095: entering _queue_task() for managed-node2/set_fact 49915 1727204296.89097: Creating lock for set_fact 49915 1727204296.89369: worker is 1 (out of 1 available) 49915 1727204296.89584: exiting _queue_task() for managed-node2/set_fact 49915 1727204296.89592: done queuing things up, now waiting for results queue to drain 49915 1727204296.89594: waiting for pending 
results... 49915 1727204296.89720: running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree 49915 1727204296.89727: in run() - task 028d2410-947f-dcd7-b5af-0000000000c3 49915 1727204296.89739: variable 'ansible_search_path' from source: unknown 49915 1727204296.89743: variable 'ansible_search_path' from source: unknown 49915 1727204296.89778: calling self._execute() 49915 1727204296.89902: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204296.89926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204296.89930: variable 'omit' from source: magic vars 49915 1727204296.90479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204296.90729: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204296.90761: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204296.90792: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204296.90815: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204296.90880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204296.90899: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204296.90923: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204296.90940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204296.91031: Evaluated conditional (not __network_is_ostree is defined): True 49915 1727204296.91034: variable 'omit' from source: magic vars 49915 1727204296.91058: variable 'omit' from source: magic vars 49915 1727204296.91139: variable '__ostree_booted_stat' from source: set_fact 49915 1727204296.91177: variable 'omit' from source: magic vars 49915 1727204296.91196: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204296.91220: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204296.91238: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204296.91251: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204296.91259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204296.91283: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204296.91286: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204296.91288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204296.91355: Set connection var ansible_connection to ssh 49915 
1727204296.91358: Set connection var ansible_shell_type to sh 49915 1727204296.91362: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204296.91370: Set connection var ansible_shell_executable to /bin/sh 49915 1727204296.91377: Set connection var ansible_timeout to 10 49915 1727204296.91384: Set connection var ansible_pipelining to False 49915 1727204296.91400: variable 'ansible_shell_executable' from source: unknown 49915 1727204296.91403: variable 'ansible_connection' from source: unknown 49915 1727204296.91405: variable 'ansible_module_compression' from source: unknown 49915 1727204296.91407: variable 'ansible_shell_type' from source: unknown 49915 1727204296.91410: variable 'ansible_shell_executable' from source: unknown 49915 1727204296.91412: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204296.91418: variable 'ansible_pipelining' from source: unknown 49915 1727204296.91420: variable 'ansible_timeout' from source: unknown 49915 1727204296.91424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204296.91500: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204296.91509: variable 'omit' from source: magic vars 49915 1727204296.91516: starting attempt loop 49915 1727204296.91519: running the handler 49915 1727204296.91526: handler run complete 49915 1727204296.91533: attempt loop complete, returning result 49915 1727204296.91536: _execute() done 49915 1727204296.91538: dumping result to json 49915 1727204296.91541: done dumping result, returning 49915 1727204296.91547: done running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree [028d2410-947f-dcd7-b5af-0000000000c3] 49915 1727204296.91556: sending task result for task 028d2410-947f-dcd7-b5af-0000000000c3 49915 1727204296.91632: done sending task result for task 028d2410-947f-dcd7-b5af-0000000000c3 49915 1727204296.91635: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 49915 1727204296.91708: no more pending results, returning what we have 49915 1727204296.91711: results queue empty 49915 1727204296.91714: checking for any_errors_fatal 49915 1727204296.91720: done checking for any_errors_fatal 49915 1727204296.91721: checking for max_fail_percentage 49915 1727204296.91722: done checking for max_fail_percentage 49915 1727204296.91723: checking to see if all hosts have failed and the running result is not ok 49915 1727204296.91724: done checking to see if all hosts have failed 49915 1727204296.91724: getting the remaining hosts for this loop 49915 1727204296.91726: done getting the remaining hosts for this loop 49915 1727204296.91729: getting the next task for host managed-node2 49915 1727204296.91736: done getting next task for host managed-node2 49915 1727204296.91738: ^ task is: TASK: Fix CentOS6 Base repo 49915 1727204296.91741: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204296.91744: getting variables 49915 1727204296.91745: in VariableManager get_vars() 49915 1727204296.91768: Calling all_inventory to load vars for managed-node2 49915 1727204296.91770: Calling groups_inventory to load vars for managed-node2 49915 1727204296.91773: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204296.91783: Calling all_plugins_play to load vars for managed-node2 49915 1727204296.91785: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204296.91793: Calling groups_plugins_play to load vars for managed-node2 49915 1727204296.91996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204296.92189: done with get_vars() 49915 1727204296.92203: done getting variables 49915 1727204296.92358: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:58:16 -0400 (0:00:00.033) 0:00:03.630 ***** 49915 1727204296.92427: entering _queue_task() for managed-node2/copy 49915 1727204296.93103: worker is 1 (out of 1 available) 49915 1727204296.93117: exiting _queue_task() for managed-node2/copy 49915 1727204296.93129: done queuing things up, now waiting for results queue to drain 49915 1727204296.93130: waiting for pending results... 
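The trace above covers the ostree detection pair in el_repo_setup.yml: a stat whose result is registered as __ostree_booted_stat (exists: false on this node) and a set_fact that turns it into __network_is_ostree. A minimal sketch of what those two tasks likely look like, assuming the conventional /run/ostree-booted marker file; the stat path itself is not printed in this log:

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted        # assumed marker path; not shown in the trace
      register: __ostree_booted_stat

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined

With exists: false registered on managed-node2, the rendered fact comes out as __network_is_ostree: false, which is exactly what the ok: result above reports.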
49915 1727204296.93419: running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo 49915 1727204296.93558: in run() - task 028d2410-947f-dcd7-b5af-0000000000c5 49915 1727204296.93620: variable 'ansible_search_path' from source: unknown 49915 1727204296.93623: variable 'ansible_search_path' from source: unknown 49915 1727204296.93626: calling self._execute() 49915 1727204296.93909: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204296.93915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204296.93918: variable 'omit' from source: magic vars 49915 1727204296.94687: variable 'ansible_distribution' from source: facts 49915 1727204296.94718: Evaluated conditional (ansible_distribution == 'CentOS'): True 49915 1727204296.94837: variable 'ansible_distribution_major_version' from source: facts 49915 1727204296.94844: Evaluated conditional (ansible_distribution_major_version == '6'): False 49915 1727204296.94847: when evaluation is False, skipping this task 49915 1727204296.94850: _execute() done 49915 1727204296.94853: dumping result to json 49915 1727204296.94857: done dumping result, returning 49915 1727204296.94863: done running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo [028d2410-947f-dcd7-b5af-0000000000c5] 49915 1727204296.94869: sending task result for task 028d2410-947f-dcd7-b5af-0000000000c5 49915 1727204296.95045: done sending task result for task 028d2410-947f-dcd7-b5af-0000000000c5 49915 1727204296.95048: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 49915 1727204296.95109: no more pending results, returning what we have 49915 1727204296.95112: results queue empty 49915 1727204296.95113: checking for any_errors_fatal 49915 1727204296.95116: done checking for any_errors_fatal 49915 1727204296.95117: checking for max_fail_percentage 49915 1727204296.95118: done checking for max_fail_percentage 49915 1727204296.95119: checking to see if all hosts have failed and the running result is not ok 49915 1727204296.95120: done checking to see if all hosts have failed 49915 1727204296.95121: getting the remaining hosts for this loop 49915 1727204296.95122: done getting the remaining hosts for this loop 49915 1727204296.95125: getting the next task for host managed-node2 49915 1727204296.95130: done getting next task for host managed-node2 49915 1727204296.95132: ^ task is: TASK: Include the task 'enable_epel.yml' 49915 1727204296.95135: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204296.95139: getting variables 49915 1727204296.95140: in VariableManager get_vars() 49915 1727204296.95170: Calling all_inventory to load vars for managed-node2 49915 1727204296.95173: Calling groups_inventory to load vars for managed-node2 49915 1727204296.95178: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204296.95188: Calling all_plugins_play to load vars for managed-node2 49915 1727204296.95191: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204296.95195: Calling groups_plugins_play to load vars for managed-node2 49915 1727204296.95431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204296.95626: done with get_vars() 49915 1727204296.95636: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:58:16 -0400 (0:00:00.033) 0:00:03.663 ***** 49915 1727204296.95731: entering _queue_task() for managed-node2/include_tasks 49915 1727204296.96180: worker is 1 (out of 1 available) 49915 1727204296.96187: exiting _queue_task() for managed-node2/include_tasks 49915 1727204296.96197: done queuing things up, now waiting for results queue to drain 49915 1727204296.96198: waiting for pending results... 49915 1727204296.96491: running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' 49915 1727204296.96496: in run() - task 028d2410-947f-dcd7-b5af-0000000000c6 49915 1727204296.96499: variable 'ansible_search_path' from source: unknown 49915 1727204296.96502: variable 'ansible_search_path' from source: unknown 49915 1727204296.96505: calling self._execute() 49915 1727204296.96507: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204296.96510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204296.96515: variable 'omit' from source: magic vars 49915 1727204296.97360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204296.99633: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204296.99723: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204296.99763: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204296.99814: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204296.99844: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204296.99934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204297.00009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204297.00015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 49915 1727204297.00050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204297.00068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204297.00194: variable '__network_is_ostree' from source: set_fact 49915 1727204297.00225: Evaluated conditional (not __network_is_ostree | d(false)): True 49915 1727204297.00238: _execute() done 49915 1727204297.00336: dumping result to json 49915 1727204297.00341: done dumping result, returning 49915 1727204297.00343: done running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' [028d2410-947f-dcd7-b5af-0000000000c6] 49915 1727204297.00346: sending task result for task 028d2410-947f-dcd7-b5af-0000000000c6 49915 1727204297.00423: done sending task result for task 028d2410-947f-dcd7-b5af-0000000000c6 49915 1727204297.00427: WORKER PROCESS EXITING 49915 1727204297.00458: no more pending results, returning what we have 49915 1727204297.00463: in VariableManager get_vars() 49915 1727204297.00498: Calling all_inventory to load vars for managed-node2 49915 1727204297.00501: Calling groups_inventory to load vars for managed-node2 49915 1727204297.00504: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204297.00514: Calling all_plugins_play to load vars for managed-node2 49915 1727204297.00517: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204297.00519: Calling groups_plugins_play to load vars for managed-node2 49915 1727204297.00757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204297.00967: done with get_vars() 49915 1727204297.00974: variable 'ansible_search_path' from source: unknown 49915 1727204297.00978: variable 'ansible_search_path' from source: unknown 49915 1727204297.01023: we have included files to process 49915 1727204297.01024: generating all_blocks data 49915 1727204297.01026: done generating all_blocks data 49915 1727204297.01030: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 49915 1727204297.01032: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 49915 1727204297.01034: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 49915 1727204297.01785: done processing included file 49915 1727204297.01788: iterating over new_blocks loaded from include file 49915 1727204297.01789: in VariableManager get_vars() 49915 1727204297.01801: done with get_vars() 49915 1727204297.01803: filtering new block on tags 49915 1727204297.01828: done filtering new block on tags 49915 1727204297.01831: in VariableManager get_vars() 49915 1727204297.01842: done with get_vars() 49915 1727204297.01843: filtering new block on tags 49915 1727204297.01863: done filtering new block on tags 49915 1727204297.01866: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node2 49915 1727204297.01871: extending task lists for all hosts with included blocks 
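The include traced above is a dynamic one: the conditional not __network_is_ostree | d(false) is evaluated first, and only then is enable_epel.yml loaded, its blocks filtered on tags, and the host's task list extended. A sketch of the including task, assuming a plain include_tasks with a relative path; the exact form used in el_repo_setup.yml is not shown here:

    - name: Include the task 'enable_epel.yml'
      include_tasks: tasks/enable_epel.yml   # assumed path form; the log shows only the resolved file
      when: not __network_is_ostree | d(false)

Because include_tasks is processed at run time rather than at parse time, the "we have included files to process" and "extending task lists" messages appear only after the conditional passes.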
49915 1727204297.01986: done extending task lists 49915 1727204297.01988: done processing included files 49915 1727204297.01989: results queue empty 49915 1727204297.01989: checking for any_errors_fatal 49915 1727204297.01992: done checking for any_errors_fatal 49915 1727204297.01992: checking for max_fail_percentage 49915 1727204297.01994: done checking for max_fail_percentage 49915 1727204297.01994: checking to see if all hosts have failed and the running result is not ok 49915 1727204297.01995: done checking to see if all hosts have failed 49915 1727204297.01996: getting the remaining hosts for this loop 49915 1727204297.01997: done getting the remaining hosts for this loop 49915 1727204297.01999: getting the next task for host managed-node2 49915 1727204297.02006: done getting next task for host managed-node2 49915 1727204297.02008: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 49915 1727204297.02010: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204297.02014: getting variables 49915 1727204297.02015: in VariableManager get_vars() 49915 1727204297.02023: Calling all_inventory to load vars for managed-node2 49915 1727204297.02024: Calling groups_inventory to load vars for managed-node2 49915 1727204297.02027: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204297.02032: Calling all_plugins_play to load vars for managed-node2 49915 1727204297.02039: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204297.02042: Calling groups_plugins_play to load vars for managed-node2 49915 1727204297.02217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204297.02422: done with get_vars() 49915 1727204297.02431: done getting variables 49915 1727204297.02496: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 49915 1727204297.02703: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:58:17 -0400 (0:00:00.070) 0:00:03.733 ***** 49915 1727204297.02759: entering _queue_task() for managed-node2/command 49915 1727204297.02761: Creating lock for command 49915 1727204297.03194: worker is 1 (out of 1 available) 49915 1727204297.03204: exiting _queue_task() for managed-node2/command 49915 1727204297.03217: done queuing things up, now waiting for results queue to drain 49915 1727204297.03219: waiting for pending results... 
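The banner "Create EPEL 10" above is a templated task name, "Create EPEL {{ ansible_distribution_major_version }}", rendered from facts just before the task is queued. A sketch of that shape, with a deliberately hypothetical command body since the real command in enable_epel.yml is not visible in this log:

    - name: Create EPEL {{ ansible_distribution_major_version }}   # renders as "Create EPEL 10" on this node
      command: /bin/true                                           # placeholder body, not taken from the log

The guard that causes the skip reported on the next lines (distribution and major version checks) is shown once in the yum-utils sketch further below.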
49915 1727204297.03462: running TaskExecutor() for managed-node2/TASK: Create EPEL 10 49915 1727204297.03487: in run() - task 028d2410-947f-dcd7-b5af-0000000000e0 49915 1727204297.03509: variable 'ansible_search_path' from source: unknown 49915 1727204297.03520: variable 'ansible_search_path' from source: unknown 49915 1727204297.03569: calling self._execute() 49915 1727204297.03643: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204297.03654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204297.03682: variable 'omit' from source: magic vars 49915 1727204297.04087: variable 'ansible_distribution' from source: facts 49915 1727204297.04114: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 49915 1727204297.04280: variable 'ansible_distribution_major_version' from source: facts 49915 1727204297.04283: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 49915 1727204297.04286: when evaluation is False, skipping this task 49915 1727204297.04288: _execute() done 49915 1727204297.04290: dumping result to json 49915 1727204297.04292: done dumping result, returning 49915 1727204297.04294: done running TaskExecutor() for managed-node2/TASK: Create EPEL 10 [028d2410-947f-dcd7-b5af-0000000000e0] 49915 1727204297.04296: sending task result for task 028d2410-947f-dcd7-b5af-0000000000e0 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 49915 1727204297.04621: no more pending results, returning what we have 49915 1727204297.04623: results queue empty 49915 1727204297.04624: checking for any_errors_fatal 49915 1727204297.04625: done checking for any_errors_fatal 49915 1727204297.04626: checking for max_fail_percentage 49915 1727204297.04627: done checking for max_fail_percentage 49915 1727204297.04628: checking to see if all hosts have failed and the running result is not ok 49915 1727204297.04629: done checking to see if all hosts have failed 49915 1727204297.04629: getting the remaining hosts for this loop 49915 1727204297.04630: done getting the remaining hosts for this loop 49915 1727204297.04633: getting the next task for host managed-node2 49915 1727204297.04639: done getting next task for host managed-node2 49915 1727204297.04641: ^ task is: TASK: Install yum-utils package 49915 1727204297.04644: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204297.04647: getting variables 49915 1727204297.04648: in VariableManager get_vars() 49915 1727204297.04672: Calling all_inventory to load vars for managed-node2 49915 1727204297.04674: Calling groups_inventory to load vars for managed-node2 49915 1727204297.04679: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204297.04690: Calling all_plugins_play to load vars for managed-node2 49915 1727204297.04693: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204297.04695: Calling groups_plugins_play to load vars for managed-node2 49915 1727204297.04967: done sending task result for task 028d2410-947f-dcd7-b5af-0000000000e0 49915 1727204297.04970: WORKER PROCESS EXITING 49915 1727204297.05001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204297.05198: done with get_vars() 49915 1727204297.05207: done getting variables 49915 1727204297.05308: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:58:17 -0400 (0:00:00.025) 0:00:03.759 ***** 49915 1727204297.05343: entering _queue_task() for managed-node2/package 49915 1727204297.05345: Creating lock for package 49915 1727204297.05699: worker is 1 (out of 1 available) 49915 1727204297.05710: exiting _queue_task() for managed-node2/package 49915 1727204297.05722: done queuing things up, now waiting for results queue to drain 49915 1727204297.05723: waiting for pending results... 
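The "Install yum-utils package" task queued above goes through the generic package action plugin, which would dispatch to the package manager detected from facts if it ran. A sketch, assuming the package name matches the task title and the same EPEL 7/8 guard that the following trace evaluates:

    - name: Install yum-utils package
      package:
        name: yum-utils        # assumed from the task title; module args are not printed in the log
        state: present
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']

On this node the second guard item fails, so the task is reported as skipping: with false_condition set to that item, the same pattern the trace shows for the other enable_epel.yml tasks.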
49915 1727204297.05909: running TaskExecutor() for managed-node2/TASK: Install yum-utils package 49915 1727204297.06029: in run() - task 028d2410-947f-dcd7-b5af-0000000000e1 49915 1727204297.06047: variable 'ansible_search_path' from source: unknown 49915 1727204297.06059: variable 'ansible_search_path' from source: unknown 49915 1727204297.06108: calling self._execute() 49915 1727204297.06190: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204297.06210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204297.06228: variable 'omit' from source: magic vars 49915 1727204297.06628: variable 'ansible_distribution' from source: facts 49915 1727204297.06654: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 49915 1727204297.06802: variable 'ansible_distribution_major_version' from source: facts 49915 1727204297.06816: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 49915 1727204297.06824: when evaluation is False, skipping this task 49915 1727204297.06834: _execute() done 49915 1727204297.06845: dumping result to json 49915 1727204297.06862: done dumping result, returning 49915 1727204297.06872: done running TaskExecutor() for managed-node2/TASK: Install yum-utils package [028d2410-947f-dcd7-b5af-0000000000e1] 49915 1727204297.06887: sending task result for task 028d2410-947f-dcd7-b5af-0000000000e1 49915 1727204297.07108: done sending task result for task 028d2410-947f-dcd7-b5af-0000000000e1 49915 1727204297.07114: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 49915 1727204297.07163: no more pending results, returning what we have 49915 1727204297.07166: results queue empty 49915 1727204297.07167: checking for any_errors_fatal 49915 1727204297.07233: done checking for any_errors_fatal 49915 1727204297.07234: checking for max_fail_percentage 49915 1727204297.07236: done checking for max_fail_percentage 49915 1727204297.07237: checking to see if all hosts have failed and the running result is not ok 49915 1727204297.07238: done checking to see if all hosts have failed 49915 1727204297.07239: getting the remaining hosts for this loop 49915 1727204297.07240: done getting the remaining hosts for this loop 49915 1727204297.07243: getting the next task for host managed-node2 49915 1727204297.07248: done getting next task for host managed-node2 49915 1727204297.07250: ^ task is: TASK: Enable EPEL 7 49915 1727204297.07253: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204297.07256: getting variables 49915 1727204297.07258: in VariableManager get_vars() 49915 1727204297.07283: Calling all_inventory to load vars for managed-node2 49915 1727204297.07324: Calling groups_inventory to load vars for managed-node2 49915 1727204297.07328: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204297.07337: Calling all_plugins_play to load vars for managed-node2 49915 1727204297.07339: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204297.07342: Calling groups_plugins_play to load vars for managed-node2 49915 1727204297.07521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204297.07718: done with get_vars() 49915 1727204297.07731: done getting variables 49915 1727204297.07785: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:58:17 -0400 (0:00:00.024) 0:00:03.784 ***** 49915 1727204297.07811: entering _queue_task() for managed-node2/command 49915 1727204297.08170: worker is 1 (out of 1 available) 49915 1727204297.08183: exiting _queue_task() for managed-node2/command 49915 1727204297.08194: done queuing things up, now waiting for results queue to drain 49915 1727204297.08195: waiting for pending results... 49915 1727204297.08392: running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 49915 1727204297.08484: in run() - task 028d2410-947f-dcd7-b5af-0000000000e2 49915 1727204297.08488: variable 'ansible_search_path' from source: unknown 49915 1727204297.08490: variable 'ansible_search_path' from source: unknown 49915 1727204297.08527: calling self._execute() 49915 1727204297.08608: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204297.08625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204297.08640: variable 'omit' from source: magic vars 49915 1727204297.09085: variable 'ansible_distribution' from source: facts 49915 1727204297.09102: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 49915 1727204297.09245: variable 'ansible_distribution_major_version' from source: facts 49915 1727204297.09258: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 49915 1727204297.09266: when evaluation is False, skipping this task 49915 1727204297.09272: _execute() done 49915 1727204297.09354: dumping result to json 49915 1727204297.09358: done dumping result, returning 49915 1727204297.09360: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 [028d2410-947f-dcd7-b5af-0000000000e2] 49915 1727204297.09363: sending task result for task 028d2410-947f-dcd7-b5af-0000000000e2 49915 1727204297.09425: done sending task result for task 028d2410-947f-dcd7-b5af-0000000000e2 49915 1727204297.09429: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 49915 1727204297.09502: no more pending results, returning what we 
have 49915 1727204297.09507: results queue empty 49915 1727204297.09508: checking for any_errors_fatal 49915 1727204297.09515: done checking for any_errors_fatal 49915 1727204297.09516: checking for max_fail_percentage 49915 1727204297.09517: done checking for max_fail_percentage 49915 1727204297.09518: checking to see if all hosts have failed and the running result is not ok 49915 1727204297.09520: done checking to see if all hosts have failed 49915 1727204297.09520: getting the remaining hosts for this loop 49915 1727204297.09522: done getting the remaining hosts for this loop 49915 1727204297.09526: getting the next task for host managed-node2 49915 1727204297.09534: done getting next task for host managed-node2 49915 1727204297.09536: ^ task is: TASK: Enable EPEL 8 49915 1727204297.09540: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204297.09544: getting variables 49915 1727204297.09546: in VariableManager get_vars() 49915 1727204297.09577: Calling all_inventory to load vars for managed-node2 49915 1727204297.09580: Calling groups_inventory to load vars for managed-node2 49915 1727204297.09584: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204297.09597: Calling all_plugins_play to load vars for managed-node2 49915 1727204297.09600: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204297.09602: Calling groups_plugins_play to load vars for managed-node2 49915 1727204297.10009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204297.10221: done with get_vars() 49915 1727204297.10231: done getting variables 49915 1727204297.10291: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:58:17 -0400 (0:00:00.025) 0:00:03.809 ***** 49915 1727204297.10328: entering _queue_task() for managed-node2/command 49915 1727204297.10606: worker is 1 (out of 1 available) 49915 1727204297.10619: exiting _queue_task() for managed-node2/command 49915 1727204297.10630: done queuing things up, now waiting for results queue to drain 49915 1727204297.10632: waiting for pending results... 
49915 1727204297.10888: running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 49915 1727204297.10950: in run() - task 028d2410-947f-dcd7-b5af-0000000000e3 49915 1727204297.10968: variable 'ansible_search_path' from source: unknown 49915 1727204297.10985: variable 'ansible_search_path' from source: unknown 49915 1727204297.11027: calling self._execute() 49915 1727204297.11103: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204297.11180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204297.11184: variable 'omit' from source: magic vars 49915 1727204297.11500: variable 'ansible_distribution' from source: facts 49915 1727204297.11526: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 49915 1727204297.11662: variable 'ansible_distribution_major_version' from source: facts 49915 1727204297.11678: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 49915 1727204297.11687: when evaluation is False, skipping this task 49915 1727204297.11695: _execute() done 49915 1727204297.11702: dumping result to json 49915 1727204297.11710: done dumping result, returning 49915 1727204297.11724: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 [028d2410-947f-dcd7-b5af-0000000000e3] 49915 1727204297.11736: sending task result for task 028d2410-947f-dcd7-b5af-0000000000e3 49915 1727204297.11923: done sending task result for task 028d2410-947f-dcd7-b5af-0000000000e3 49915 1727204297.11927: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 49915 1727204297.12015: no more pending results, returning what we have 49915 1727204297.12019: results queue empty 49915 1727204297.12020: checking for any_errors_fatal 49915 1727204297.12025: done checking for any_errors_fatal 49915 1727204297.12026: checking for max_fail_percentage 49915 1727204297.12028: done checking for max_fail_percentage 49915 1727204297.12029: checking to see if all hosts have failed and the running result is not ok 49915 1727204297.12030: done checking to see if all hosts have failed 49915 1727204297.12031: getting the remaining hosts for this loop 49915 1727204297.12033: done getting the remaining hosts for this loop 49915 1727204297.12037: getting the next task for host managed-node2 49915 1727204297.12047: done getting next task for host managed-node2 49915 1727204297.12050: ^ task is: TASK: Enable EPEL 6 49915 1727204297.12054: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204297.12058: getting variables 49915 1727204297.12060: in VariableManager get_vars() 49915 1727204297.12091: Calling all_inventory to load vars for managed-node2 49915 1727204297.12094: Calling groups_inventory to load vars for managed-node2 49915 1727204297.12097: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204297.12110: Calling all_plugins_play to load vars for managed-node2 49915 1727204297.12115: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204297.12118: Calling groups_plugins_play to load vars for managed-node2 49915 1727204297.12427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204297.12626: done with get_vars() 49915 1727204297.12636: done getting variables 49915 1727204297.12685: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:58:17 -0400 (0:00:00.023) 0:00:03.833 ***** 49915 1727204297.12714: entering _queue_task() for managed-node2/copy 49915 1727204297.12918: worker is 1 (out of 1 available) 49915 1727204297.12930: exiting _queue_task() for managed-node2/copy 49915 1727204297.12940: done queuing things up, now waiting for results queue to drain 49915 1727204297.12942: waiting for pending results... 49915 1727204297.13095: running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 49915 1727204297.13160: in run() - task 028d2410-947f-dcd7-b5af-0000000000e5 49915 1727204297.13185: variable 'ansible_search_path' from source: unknown 49915 1727204297.13188: variable 'ansible_search_path' from source: unknown 49915 1727204297.13203: calling self._execute() 49915 1727204297.13256: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204297.13260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204297.13272: variable 'omit' from source: magic vars 49915 1727204297.13582: variable 'ansible_distribution' from source: facts 49915 1727204297.13592: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 49915 1727204297.13671: variable 'ansible_distribution_major_version' from source: facts 49915 1727204297.13674: Evaluated conditional (ansible_distribution_major_version == '6'): False 49915 1727204297.13680: when evaluation is False, skipping this task 49915 1727204297.13683: _execute() done 49915 1727204297.13686: dumping result to json 49915 1727204297.13690: done dumping result, returning 49915 1727204297.13696: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 [028d2410-947f-dcd7-b5af-0000000000e5] 49915 1727204297.13701: sending task result for task 028d2410-947f-dcd7-b5af-0000000000e5 49915 1727204297.13789: done sending task result for task 028d2410-947f-dcd7-b5af-0000000000e5 49915 1727204297.13792: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 49915 1727204297.13856: no more pending results, returning what we have 49915 
1727204297.13859: results queue empty 49915 1727204297.13860: checking for any_errors_fatal 49915 1727204297.13862: done checking for any_errors_fatal 49915 1727204297.13863: checking for max_fail_percentage 49915 1727204297.13865: done checking for max_fail_percentage 49915 1727204297.13865: checking to see if all hosts have failed and the running result is not ok 49915 1727204297.13866: done checking to see if all hosts have failed 49915 1727204297.13867: getting the remaining hosts for this loop 49915 1727204297.13868: done getting the remaining hosts for this loop 49915 1727204297.13871: getting the next task for host managed-node2 49915 1727204297.13880: done getting next task for host managed-node2 49915 1727204297.13882: ^ task is: TASK: Set network provider to 'nm' 49915 1727204297.13884: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204297.13887: getting variables 49915 1727204297.13888: in VariableManager get_vars() 49915 1727204297.13913: Calling all_inventory to load vars for managed-node2 49915 1727204297.13915: Calling groups_inventory to load vars for managed-node2 49915 1727204297.13918: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204297.13926: Calling all_plugins_play to load vars for managed-node2 49915 1727204297.13929: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204297.13931: Calling groups_plugins_play to load vars for managed-node2 49915 1727204297.14059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204297.14172: done with get_vars() 49915 1727204297.14180: done getting variables 49915 1727204297.14218: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:13 Tuesday 24 September 2024 14:58:17 -0400 (0:00:00.015) 0:00:03.848 ***** 49915 1727204297.14236: entering _queue_task() for managed-node2/set_fact 49915 1727204297.14419: worker is 1 (out of 1 available) 49915 1727204297.14431: exiting _queue_task() for managed-node2/set_fact 49915 1727204297.14442: done queuing things up, now waiting for results queue to drain 49915 1727204297.14443: waiting for pending results... 
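The next task comes from the test playbook itself (tests_vlan_mtu_nm.yml:13) rather than from el_repo_setup.yml. Judging by the module (set_fact) and the result reported just below, it is a one-line fact assignment; a sketch:

    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm

The ok: result that follows confirms the fact lands as network_provider: "nm" with changed: false, presumably so that the network role under test selects the NetworkManager provider.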
49915 1727204297.14593: running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' 49915 1727204297.14674: in run() - task 028d2410-947f-dcd7-b5af-000000000007 49915 1727204297.14684: variable 'ansible_search_path' from source: unknown 49915 1727204297.14712: calling self._execute() 49915 1727204297.14829: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204297.14833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204297.14836: variable 'omit' from source: magic vars 49915 1727204297.14914: variable 'omit' from source: magic vars 49915 1727204297.14918: variable 'omit' from source: magic vars 49915 1727204297.14951: variable 'omit' from source: magic vars 49915 1727204297.14993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204297.15183: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204297.15186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204297.15188: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204297.15190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204297.15192: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204297.15194: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204297.15196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204297.15231: Set connection var ansible_connection to ssh 49915 1727204297.15238: Set connection var ansible_shell_type to sh 49915 1727204297.15248: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204297.15261: Set connection var ansible_shell_executable to /bin/sh 49915 1727204297.15270: Set connection var ansible_timeout to 10 49915 1727204297.15284: Set connection var ansible_pipelining to False 49915 1727204297.15314: variable 'ansible_shell_executable' from source: unknown 49915 1727204297.15322: variable 'ansible_connection' from source: unknown 49915 1727204297.15329: variable 'ansible_module_compression' from source: unknown 49915 1727204297.15334: variable 'ansible_shell_type' from source: unknown 49915 1727204297.15340: variable 'ansible_shell_executable' from source: unknown 49915 1727204297.15346: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204297.15357: variable 'ansible_pipelining' from source: unknown 49915 1727204297.15363: variable 'ansible_timeout' from source: unknown 49915 1727204297.15369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204297.15520: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204297.15538: variable 'omit' from source: magic vars 49915 1727204297.15548: starting attempt loop 49915 1727204297.15551: running the handler 49915 1727204297.15559: handler run complete 49915 1727204297.15567: attempt loop complete, returning result 49915 1727204297.15570: _execute() done 49915 1727204297.15572: 
dumping result to json 49915 1727204297.15575: done dumping result, returning 49915 1727204297.15587: done running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' [028d2410-947f-dcd7-b5af-000000000007] 49915 1727204297.15602: sending task result for task 028d2410-947f-dcd7-b5af-000000000007 49915 1727204297.15670: done sending task result for task 028d2410-947f-dcd7-b5af-000000000007 49915 1727204297.15673: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 49915 1727204297.15763: no more pending results, returning what we have 49915 1727204297.15766: results queue empty 49915 1727204297.15767: checking for any_errors_fatal 49915 1727204297.15770: done checking for any_errors_fatal 49915 1727204297.15771: checking for max_fail_percentage 49915 1727204297.15772: done checking for max_fail_percentage 49915 1727204297.15773: checking to see if all hosts have failed and the running result is not ok 49915 1727204297.15774: done checking to see if all hosts have failed 49915 1727204297.15774: getting the remaining hosts for this loop 49915 1727204297.15778: done getting the remaining hosts for this loop 49915 1727204297.15781: getting the next task for host managed-node2 49915 1727204297.15786: done getting next task for host managed-node2 49915 1727204297.15787: ^ task is: TASK: meta (flush_handlers) 49915 1727204297.15789: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204297.15793: getting variables 49915 1727204297.15794: in VariableManager get_vars() 49915 1727204297.15820: Calling all_inventory to load vars for managed-node2 49915 1727204297.15822: Calling groups_inventory to load vars for managed-node2 49915 1727204297.15825: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204297.15833: Calling all_plugins_play to load vars for managed-node2 49915 1727204297.15835: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204297.15838: Calling groups_plugins_play to load vars for managed-node2 49915 1727204297.15992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204297.16207: done with get_vars() 49915 1727204297.16219: done getting variables 49915 1727204297.16289: in VariableManager get_vars() 49915 1727204297.16297: Calling all_inventory to load vars for managed-node2 49915 1727204297.16299: Calling groups_inventory to load vars for managed-node2 49915 1727204297.16301: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204297.16306: Calling all_plugins_play to load vars for managed-node2 49915 1727204297.16309: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204297.16314: Calling groups_plugins_play to load vars for managed-node2 49915 1727204297.16446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204297.16629: done with get_vars() 49915 1727204297.16642: done queuing things up, now waiting for results queue to drain 49915 1727204297.16643: results queue empty 49915 1727204297.16644: checking for any_errors_fatal 49915 1727204297.16646: done checking for any_errors_fatal 49915 1727204297.16647: checking for 
max_fail_percentage 49915 1727204297.16648: done checking for max_fail_percentage 49915 1727204297.16649: checking to see if all hosts have failed and the running result is not ok 49915 1727204297.16649: done checking to see if all hosts have failed 49915 1727204297.16650: getting the remaining hosts for this loop 49915 1727204297.16651: done getting the remaining hosts for this loop 49915 1727204297.16653: getting the next task for host managed-node2 49915 1727204297.16657: done getting next task for host managed-node2 49915 1727204297.16659: ^ task is: TASK: meta (flush_handlers) 49915 1727204297.16660: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204297.16668: getting variables 49915 1727204297.16669: in VariableManager get_vars() 49915 1727204297.16679: Calling all_inventory to load vars for managed-node2 49915 1727204297.16681: Calling groups_inventory to load vars for managed-node2 49915 1727204297.16683: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204297.16688: Calling all_plugins_play to load vars for managed-node2 49915 1727204297.16690: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204297.16692: Calling groups_plugins_play to load vars for managed-node2 49915 1727204297.16826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204297.17169: done with get_vars() 49915 1727204297.17174: done getting variables 49915 1727204297.17207: in VariableManager get_vars() 49915 1727204297.17215: Calling all_inventory to load vars for managed-node2 49915 1727204297.17216: Calling groups_inventory to load vars for managed-node2 49915 1727204297.17217: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204297.17220: Calling all_plugins_play to load vars for managed-node2 49915 1727204297.17221: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204297.17223: Calling groups_plugins_play to load vars for managed-node2 49915 1727204297.17306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204297.17415: done with get_vars() 49915 1727204297.17422: done queuing things up, now waiting for results queue to drain 49915 1727204297.17423: results queue empty 49915 1727204297.17424: checking for any_errors_fatal 49915 1727204297.17425: done checking for any_errors_fatal 49915 1727204297.17425: checking for max_fail_percentage 49915 1727204297.17426: done checking for max_fail_percentage 49915 1727204297.17426: checking to see if all hosts have failed and the running result is not ok 49915 1727204297.17427: done checking to see if all hosts have failed 49915 1727204297.17427: getting the remaining hosts for this loop 49915 1727204297.17428: done getting the remaining hosts for this loop 49915 1727204297.17429: getting the next task for host managed-node2 49915 1727204297.17431: done getting next task for host managed-node2 49915 1727204297.17432: ^ task is: None 49915 1727204297.17433: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 49915 1727204297.17433: done queuing things up, now waiting for results queue to drain 49915 1727204297.17434: results queue empty 49915 1727204297.17434: checking for any_errors_fatal 49915 1727204297.17435: done checking for any_errors_fatal 49915 1727204297.17435: checking for max_fail_percentage 49915 1727204297.17436: done checking for max_fail_percentage 49915 1727204297.17436: checking to see if all hosts have failed and the running result is not ok 49915 1727204297.17436: done checking to see if all hosts have failed 49915 1727204297.17438: getting the next task for host managed-node2 49915 1727204297.17439: done getting next task for host managed-node2 49915 1727204297.17440: ^ task is: None 49915 1727204297.17441: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204297.17476: in VariableManager get_vars() 49915 1727204297.17497: done with get_vars() 49915 1727204297.17502: in VariableManager get_vars() 49915 1727204297.17513: done with get_vars() 49915 1727204297.17516: variable 'omit' from source: magic vars 49915 1727204297.17537: in VariableManager get_vars() 49915 1727204297.17546: done with get_vars() 49915 1727204297.17559: variable 'omit' from source: magic vars PLAY [Play for testing vlan mtu setting] *************************************** 49915 1727204297.17787: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 49915 1727204297.17809: getting the remaining hosts for this loop 49915 1727204297.17810: done getting the remaining hosts for this loop 49915 1727204297.17815: getting the next task for host managed-node2 49915 1727204297.17817: done getting next task for host managed-node2 49915 1727204297.17819: ^ task is: TASK: Gathering Facts 49915 1727204297.17820: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204297.17822: getting variables 49915 1727204297.17822: in VariableManager get_vars() 49915 1727204297.17830: Calling all_inventory to load vars for managed-node2 49915 1727204297.17832: Calling groups_inventory to load vars for managed-node2 49915 1727204297.17833: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204297.17836: Calling all_plugins_play to load vars for managed-node2 49915 1727204297.17845: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204297.17846: Calling groups_plugins_play to load vars for managed-node2 49915 1727204297.17929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204297.18066: done with get_vars() 49915 1727204297.18071: done getting variables 49915 1727204297.18099: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:3 Tuesday 24 September 2024 14:58:17 -0400 (0:00:00.038) 0:00:03.887 ***** 49915 1727204297.18116: entering _queue_task() for managed-node2/gather_facts 49915 1727204297.18335: worker is 1 (out of 1 available) 49915 1727204297.18346: exiting _queue_task() for managed-node2/gather_facts 49915 1727204297.18356: done queuing things up, now waiting for results queue to drain 49915 1727204297.18357: waiting for pending results... 
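
Every debug entry in this trace has the same shape: the controller PID (49915 here), a sub-second epoch timestamp, and a free-form message, so the timing between steps can be recovered directly from the text. What follows is a minimal, hand-written sketch of a parser for that shape; the regular expression and field names are illustrative only and are not anything Ansible itself provides.

    import re

    # Sketch: split "<pid> <epoch>: <message>" debug entries out of a captured
    # -vvv log and print the time elapsed between consecutive entries.
    ENTRY_RE = re.compile(
        r"(?P<pid>\d{4,})\s+(?P<ts>\d{10}\.\d+):\s+"     # e.g. "49915 1727204297.18116: "
        r"(?P<msg>.*?)(?=\s*\d{4,}\s+\d{10}\.\d+:|\Z)",  # message runs until the next entry
        re.S,
    )

    def entries(text):
        """Yield (pid, timestamp, message) for each debug entry found in text."""
        for m in ENTRY_RE.finditer(text):
            yield int(m["pid"]), float(m["ts"]), m["msg"].strip()

    sample = ("49915 1727204297.18116: entering _queue_task() for managed-node2/gather_facts "
              "49915 1727204297.18335: worker is 1 (out of 1 available)")
    prev = None
    for pid, ts, msg in entries(sample):
        gap = 0.0 if prev is None else ts - prev
        print(f"pid={pid} +{gap:.5f}s {msg}")
        prev = ts

Applied to the two entries in the sample, this reports roughly a 2 ms gap between queuing the gather_facts task and the worker picking it up, which matches the cadence visible throughout the trace.
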
49915 1727204297.18503: running TaskExecutor() for managed-node2/TASK: Gathering Facts 49915 1727204297.18559: in run() - task 028d2410-947f-dcd7-b5af-00000000010b 49915 1727204297.18570: variable 'ansible_search_path' from source: unknown 49915 1727204297.18604: calling self._execute() 49915 1727204297.18666: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204297.18669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204297.18679: variable 'omit' from source: magic vars 49915 1727204297.18952: variable 'ansible_distribution_major_version' from source: facts 49915 1727204297.18962: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204297.18967: variable 'omit' from source: magic vars 49915 1727204297.18986: variable 'omit' from source: magic vars 49915 1727204297.19046: variable 'omit' from source: magic vars 49915 1727204297.19081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204297.19116: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204297.19281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204297.19285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204297.19288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204297.19290: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204297.19292: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204297.19295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204297.19339: Set connection var ansible_connection to ssh 49915 1727204297.19347: Set connection var ansible_shell_type to sh 49915 1727204297.19359: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204297.19421: Set connection var ansible_shell_executable to /bin/sh 49915 1727204297.19425: Set connection var ansible_timeout to 10 49915 1727204297.19427: Set connection var ansible_pipelining to False 49915 1727204297.19432: variable 'ansible_shell_executable' from source: unknown 49915 1727204297.19439: variable 'ansible_connection' from source: unknown 49915 1727204297.19445: variable 'ansible_module_compression' from source: unknown 49915 1727204297.19451: variable 'ansible_shell_type' from source: unknown 49915 1727204297.19457: variable 'ansible_shell_executable' from source: unknown 49915 1727204297.19462: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204297.19468: variable 'ansible_pipelining' from source: unknown 49915 1727204297.19473: variable 'ansible_timeout' from source: unknown 49915 1727204297.19528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204297.19689: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204297.19706: variable 'omit' from source: magic vars 49915 1727204297.19718: starting attempt loop 49915 1727204297.19724: running the 
handler 49915 1727204297.19764: variable 'ansible_facts' from source: unknown 49915 1727204297.19855: _low_level_execute_command(): starting 49915 1727204297.19858: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204297.20426: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204297.20442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204297.20456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204297.20511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204297.20529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204297.20618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204297.22760: stdout chunk (state=3): >>>/root <<< 49915 1727204297.22848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204297.22901: stderr chunk (state=3): >>><<< 49915 1727204297.22905: stdout chunk (state=3): >>><<< 49915 1727204297.23032: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204297.23038: _low_level_execute_command(): starting 49915 1727204297.23041: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204297.2292073-50207-211426930565017 `" && echo 
ansible-tmp-1727204297.2292073-50207-211426930565017="` echo /root/.ansible/tmp/ansible-tmp-1727204297.2292073-50207-211426930565017 `" ) && sleep 0' 49915 1727204297.23601: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204297.23614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204297.23637: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 49915 1727204297.23655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204297.23685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204297.23697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204297.23781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204297.25905: stdout chunk (state=3): >>>ansible-tmp-1727204297.2292073-50207-211426930565017=/root/.ansible/tmp/ansible-tmp-1727204297.2292073-50207-211426930565017 <<< 49915 1727204297.26090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204297.26095: stdout chunk (state=3): >>><<< 49915 1727204297.26098: stderr chunk (state=3): >>><<< 49915 1727204297.26285: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204297.2292073-50207-211426930565017=/root/.ansible/tmp/ansible-tmp-1727204297.2292073-50207-211426930565017 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204297.26289: variable 'ansible_module_compression' from source: unknown 49915 1727204297.26292: ANSIBALLZ: using 
cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 49915 1727204297.26431: variable 'ansible_facts' from source: unknown 49915 1727204297.26758: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204297.2292073-50207-211426930565017/AnsiballZ_setup.py 49915 1727204297.27099: Sending initial data 49915 1727204297.27111: Sent initial data (154 bytes) 49915 1727204297.27981: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204297.27997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204297.28009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204297.28078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204297.28105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204297.28209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 49915 1727204297.30466: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204297.30533: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49915 1727204297.30611: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpvjdhi_c1 /root/.ansible/tmp/ansible-tmp-1727204297.2292073-50207-211426930565017/AnsiballZ_setup.py <<< 49915 1727204297.30627: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204297.2292073-50207-211426930565017/AnsiballZ_setup.py" <<< 49915 1727204297.30708: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpvjdhi_c1" to remote "/root/.ansible/tmp/ansible-tmp-1727204297.2292073-50207-211426930565017/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204297.2292073-50207-211426930565017/AnsiballZ_setup.py" <<< 49915 1727204297.32710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204297.32773: stderr chunk (state=3): >>><<< 49915 1727204297.32779: stdout chunk (state=3): >>><<< 49915 1727204297.32787: done transferring module to remote 49915 1727204297.32825: _low_level_execute_command(): starting 49915 1727204297.32828: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204297.2292073-50207-211426930565017/ /root/.ansible/tmp/ansible-tmp-1727204297.2292073-50207-211426930565017/AnsiballZ_setup.py && sleep 0' 49915 1727204297.33404: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204297.33408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204297.33411: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 49915 1727204297.33413: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204297.33417: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204297.33456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204297.33487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204297.33588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 49915 1727204297.35787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204297.35796: stdout chunk (state=3): >>><<< 49915 1727204297.35806: stderr chunk (state=3): >>><<< 49915 1727204297.35820: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 
originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 49915 1727204297.35826: _low_level_execute_command(): starting 49915 1727204297.35832: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204297.2292073-50207-211426930565017/AnsiballZ_setup.py && sleep 0' 49915 1727204297.36285: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204297.36288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204297.36291: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204297.36293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204297.36346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204297.36351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204297.36471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 49915 1727204298.11562: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_is_chroot": false, "ansible_local": {}, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "58", "second": "17", "epoch": "1727204297", "epoch_int": "1727204297", "date": "2024-09-24", "time": "14:58:17", "iso8601_micro": "2024-09-24T18:58:17.764020Z", "iso8601": "2024-09-24T18:58:17Z", "iso8601_basic": "20240924T145817764020", "iso8601_basic_short": "20240924T145817", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2904, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 627, "free": 2904}, 
"nocache": {"free": 3282, "used": 249}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansib<<< 49915 1727204298.11611: stdout chunk (state=3): >>>le_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 884, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261762482176, "block_size": 4096, "block_total": 65519099, "block_available": 63906856, "block_used": 1612243, "inode_total": 131070960, "inode_available": 131027123, "inode_used": 43837, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_loadavg": {"1m": 0.60888671875, "5m": 0.6630859375, "15m": 0.38232421875}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", 
"tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "1<<< 49915 1727204298.11623: stdout chunk (state=3): >>>0.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 49915 1727204298.13590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 49915 1727204298.13593: stdout chunk (state=3): >>><<< 49915 1727204298.13595: stderr chunk (state=3): >>><<< 49915 1727204298.13634: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDCKfekAEZYR53Sflto5StFmxFelQM4lRrAAVLuV4unAO7AeBdRuM4bPUNwa4uCSoGHL62IHioaQMlV58injOOB+4msTnahmXn4RzK27CFdJyeG4+mbMcaasAZdetRv7YY0F+xmjTZhkn0uU4RWUFZe4Vul9OyoJimgehdfRcxTn1fiCYYbNZuijT9B8CZXqEdbP7q7S2v/t9Nm3ZGGWq1PR/kqP/oAYVW89pfJqGlqFNb5F78BsIqr8qKhrMfVFMJ0Pmg1ibxXuXtM2SW3wzFXT6ThQj8dF0/ZfqH8w98dAa25fAGalbHMFX2TrZS4sGe/M59ek3C5nSAO2LS3EaO856NjXKuhmeF3wt9FOoBACO8Er29y88fB6EZd0f9AKfrtM0y2tEdlxNxq3A2Wj5MAiiioEdsqSnxhhWsqlKdzHt2xKwnU+w0k9Sh94C95sZJ+5gjIn6TFjzqxylL/AiozwlFE2z1n44rfScbyNi7Ed37nderfVGW7nj+wWp7Gsas=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI5uKCdGb1mUx4VEjQb7HewXDRy/mfLHseVHU+f1n/3pAQVGZqPAbiH8Gt1sqO0Dfa4tslCvAqvuNi6RgfRKFiw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOh6fu957jE38mpLVIOfQlYW6ApDEuwpuJtRBPCnVg1K", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_is_chroot": false, "ansible_local": {}, "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a3e031bc5ef3e8854b8deb3292792", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "58", "second": "17", "epoch": "1727204297", "epoch_int": "1727204297", "date": "2024-09-24", "time": "14:58:17", "iso8601_micro": "2024-09-24T18:58:17.764020Z", "iso8601": "2024-09-24T18:58:17Z", "iso8601_basic": "20240924T145817764020", "iso8601_basic_short": 
"20240924T145817", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2904, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 627, "free": 2904}, "nocache": {"free": 3282, "used": 249}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_uuid": "ec2a3e03-1bc5-ef3e-8854-b8deb3292792", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 884, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261762482176, "block_size": 4096, "block_total": 65519099, "block_available": 63906856, "block_used": 1612243, "inode_total": 131070960, "inode_available": 131027123, "inode_used": 43837, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_loadavg": {"1m": 0.60888671875, "5m": 0.6630859375, "15m": 0.38232421875}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", 
"ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:e4ff:fe80:fb2d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.254", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:e4:80:fb:2d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.254"], "ansible_all_ipv6_addresses": ["fe80::8ff:e4ff:fe80:fb2d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.254", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:e4ff:fe80:fb2d"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 58442 10.31.13.254 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 58442 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, 
"filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 49915 1727204298.14139: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204297.2292073-50207-211426930565017/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204298.14143: _low_level_execute_command(): starting 49915 1727204298.14145: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204297.2292073-50207-211426930565017/ > /dev/null 2>&1 && sleep 0' 49915 1727204298.14816: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204298.14831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204298.14899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204298.15205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204298.15271: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204298.15482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204298.17361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204298.17380: stdout chunk (state=3): >>><<< 49915 1727204298.17416: stderr chunk (state=3): >>><<< 49915 1727204298.17419: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204298.17422: handler run complete 49915 1727204298.17498: variable 'ansible_facts' from source: unknown 49915 1727204298.17704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204298.17935: variable 'ansible_facts' from source: unknown 49915 1727204298.17946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204298.18023: attempt loop complete, returning result 49915 1727204298.18027: _execute() done 49915 1727204298.18029: dumping result to json 49915 1727204298.18064: done dumping result, returning 49915 1727204298.18071: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [028d2410-947f-dcd7-b5af-00000000010b] 49915 1727204298.18077: sending task result for task 028d2410-947f-dcd7-b5af-00000000010b 49915 1727204298.18682: done sending task result for task 028d2410-947f-dcd7-b5af-00000000010b 49915 1727204298.18685: WORKER PROCESS EXITING ok: [managed-node2] 49915 1727204298.18987: no more pending results, returning what we have 49915 1727204298.19008: results queue empty 49915 1727204298.19010: checking for any_errors_fatal 49915 1727204298.19011: done checking for any_errors_fatal 49915 1727204298.19020: checking for max_fail_percentage 49915 1727204298.19022: done checking for max_fail_percentage 49915 1727204298.19061: checking to see if all hosts have failed and the running result is not ok 49915 1727204298.19062: done checking to see if all hosts have failed 49915 1727204298.19063: getting the remaining hosts for this loop 49915 1727204298.19080: done getting the remaining hosts for this loop 49915 1727204298.19085: getting the next task for host managed-node2 49915 1727204298.19091: done getting next task for host managed-node2 49915 1727204298.19093: ^ task is: TASK: meta (flush_handlers) 49915 
1727204298.19095: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204298.19106: getting variables 49915 1727204298.19108: in VariableManager get_vars() 49915 1727204298.19141: Calling all_inventory to load vars for managed-node2 49915 1727204298.19144: Calling groups_inventory to load vars for managed-node2 49915 1727204298.19147: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204298.19156: Calling all_plugins_play to load vars for managed-node2 49915 1727204298.19159: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204298.19163: Calling groups_plugins_play to load vars for managed-node2 49915 1727204298.19340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204298.19509: done with get_vars() 49915 1727204298.19518: done getting variables 49915 1727204298.19598: in VariableManager get_vars() 49915 1727204298.19612: Calling all_inventory to load vars for managed-node2 49915 1727204298.19614: Calling groups_inventory to load vars for managed-node2 49915 1727204298.19616: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204298.19621: Calling all_plugins_play to load vars for managed-node2 49915 1727204298.19627: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204298.19630: Calling groups_plugins_play to load vars for managed-node2 49915 1727204298.19757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204298.19930: done with get_vars() 49915 1727204298.19943: done queuing things up, now waiting for results queue to drain 49915 1727204298.19945: results queue empty 49915 1727204298.19946: checking for any_errors_fatal 49915 1727204298.19948: done checking for any_errors_fatal 49915 1727204298.19949: checking for max_fail_percentage 49915 1727204298.19950: done checking for max_fail_percentage 49915 1727204298.19951: checking to see if all hosts have failed and the running result is not ok 49915 1727204298.19954: done checking to see if all hosts have failed 49915 1727204298.19955: getting the remaining hosts for this loop 49915 1727204298.19956: done getting the remaining hosts for this loop 49915 1727204298.19958: getting the next task for host managed-node2 49915 1727204298.19961: done getting next task for host managed-node2 49915 1727204298.19964: ^ task is: TASK: Include the task 'show_interfaces.yml' 49915 1727204298.19965: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204298.19967: getting variables 49915 1727204298.19968: in VariableManager get_vars() 49915 1727204298.19984: Calling all_inventory to load vars for managed-node2 49915 1727204298.19986: Calling groups_inventory to load vars for managed-node2 49915 1727204298.19988: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204298.19996: Calling all_plugins_play to load vars for managed-node2 49915 1727204298.20001: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204298.20004: Calling groups_plugins_play to load vars for managed-node2 49915 1727204298.20137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204298.20334: done with get_vars() 49915 1727204298.20343: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:10 Tuesday 24 September 2024 14:58:18 -0400 (0:00:01.023) 0:00:04.910 ***** 49915 1727204298.20421: entering _queue_task() for managed-node2/include_tasks 49915 1727204298.20774: worker is 1 (out of 1 available) 49915 1727204298.20791: exiting _queue_task() for managed-node2/include_tasks 49915 1727204298.20803: done queuing things up, now waiting for results queue to drain 49915 1727204298.20804: waiting for pending results... 49915 1727204298.21110: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 49915 1727204298.21829: in run() - task 028d2410-947f-dcd7-b5af-00000000000b 49915 1727204298.21832: variable 'ansible_search_path' from source: unknown 49915 1727204298.21835: calling self._execute() 49915 1727204298.22063: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204298.22133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204298.22211: variable 'omit' from source: magic vars 49915 1727204298.23668: variable 'ansible_distribution_major_version' from source: facts 49915 1727204298.23672: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204298.23674: _execute() done 49915 1727204298.23679: dumping result to json 49915 1727204298.23699: done dumping result, returning 49915 1727204298.23710: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [028d2410-947f-dcd7-b5af-00000000000b] 49915 1727204298.23724: sending task result for task 028d2410-947f-dcd7-b5af-00000000000b 49915 1727204298.24051: done sending task result for task 028d2410-947f-dcd7-b5af-00000000000b 49915 1727204298.24054: WORKER PROCESS EXITING 49915 1727204298.24097: no more pending results, returning what we have 49915 1727204298.24102: in VariableManager get_vars() 49915 1727204298.24149: Calling all_inventory to load vars for managed-node2 49915 1727204298.24152: Calling groups_inventory to load vars for managed-node2 49915 1727204298.24154: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204298.24174: Calling all_plugins_play to load vars for managed-node2 49915 1727204298.24180: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204298.24184: Calling groups_plugins_play to load vars for managed-node2 49915 1727204298.24484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204298.24890: done with get_vars() 49915 1727204298.24898: 
variable 'ansible_search_path' from source: unknown 49915 1727204298.24912: we have included files to process 49915 1727204298.24913: generating all_blocks data 49915 1727204298.24914: done generating all_blocks data 49915 1727204298.24915: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49915 1727204298.24916: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49915 1727204298.24919: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49915 1727204298.25032: in VariableManager get_vars() 49915 1727204298.25049: done with get_vars() 49915 1727204298.25227: done processing included file 49915 1727204298.25229: iterating over new_blocks loaded from include file 49915 1727204298.25231: in VariableManager get_vars() 49915 1727204298.25269: done with get_vars() 49915 1727204298.25272: filtering new block on tags 49915 1727204298.25298: done filtering new block on tags 49915 1727204298.25304: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 49915 1727204298.25311: extending task lists for all hosts with included blocks 49915 1727204298.27463: done extending task lists 49915 1727204298.27465: done processing included files 49915 1727204298.27465: results queue empty 49915 1727204298.27466: checking for any_errors_fatal 49915 1727204298.27467: done checking for any_errors_fatal 49915 1727204298.27468: checking for max_fail_percentage 49915 1727204298.27469: done checking for max_fail_percentage 49915 1727204298.27470: checking to see if all hosts have failed and the running result is not ok 49915 1727204298.27471: done checking to see if all hosts have failed 49915 1727204298.27472: getting the remaining hosts for this loop 49915 1727204298.27473: done getting the remaining hosts for this loop 49915 1727204298.27477: getting the next task for host managed-node2 49915 1727204298.27482: done getting next task for host managed-node2 49915 1727204298.27483: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 49915 1727204298.27486: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204298.27488: getting variables 49915 1727204298.27489: in VariableManager get_vars() 49915 1727204298.27505: Calling all_inventory to load vars for managed-node2 49915 1727204298.27507: Calling groups_inventory to load vars for managed-node2 49915 1727204298.27509: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204298.27517: Calling all_plugins_play to load vars for managed-node2 49915 1727204298.27520: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204298.27522: Calling groups_plugins_play to load vars for managed-node2 49915 1727204298.27656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204298.27845: done with get_vars() 49915 1727204298.27854: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:58:18 -0400 (0:00:00.075) 0:00:04.985 ***** 49915 1727204298.27927: entering _queue_task() for managed-node2/include_tasks 49915 1727204298.28224: worker is 1 (out of 1 available) 49915 1727204298.28235: exiting _queue_task() for managed-node2/include_tasks 49915 1727204298.28247: done queuing things up, now waiting for results queue to drain 49915 1727204298.28248: waiting for pending results... 49915 1727204298.28992: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 49915 1727204298.28998: in run() - task 028d2410-947f-dcd7-b5af-000000000120 49915 1727204298.29001: variable 'ansible_search_path' from source: unknown 49915 1727204298.29004: variable 'ansible_search_path' from source: unknown 49915 1727204298.29182: calling self._execute() 49915 1727204298.29260: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204298.29288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204298.29303: variable 'omit' from source: magic vars 49915 1727204298.29861: variable 'ansible_distribution_major_version' from source: facts 49915 1727204298.29886: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204298.29897: _execute() done 49915 1727204298.29907: dumping result to json 49915 1727204298.29917: done dumping result, returning 49915 1727204298.29928: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [028d2410-947f-dcd7-b5af-000000000120] 49915 1727204298.29943: sending task result for task 028d2410-947f-dcd7-b5af-000000000120 49915 1727204298.30070: no more pending results, returning what we have 49915 1727204298.30080: in VariableManager get_vars() 49915 1727204298.30129: Calling all_inventory to load vars for managed-node2 49915 1727204298.30133: Calling groups_inventory to load vars for managed-node2 49915 1727204298.30135: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204298.30198: Calling all_plugins_play to load vars for managed-node2 49915 1727204298.30202: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204298.30206: Calling groups_plugins_play to load vars for managed-node2 49915 1727204298.30527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204298.30735: done with get_vars() 49915 1727204298.30742: variable 'ansible_search_path' from source: 
unknown 49915 1727204298.30744: variable 'ansible_search_path' from source: unknown 49915 1727204298.30767: done sending task result for task 028d2410-947f-dcd7-b5af-000000000120 49915 1727204298.30771: WORKER PROCESS EXITING 49915 1727204298.30793: we have included files to process 49915 1727204298.30794: generating all_blocks data 49915 1727204298.30795: done generating all_blocks data 49915 1727204298.30796: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49915 1727204298.30797: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49915 1727204298.30799: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49915 1727204298.31179: done processing included file 49915 1727204298.31181: iterating over new_blocks loaded from include file 49915 1727204298.31182: in VariableManager get_vars() 49915 1727204298.31200: done with get_vars() 49915 1727204298.31202: filtering new block on tags 49915 1727204298.31215: done filtering new block on tags 49915 1727204298.31217: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 49915 1727204298.31220: extending task lists for all hosts with included blocks 49915 1727204298.31282: done extending task lists 49915 1727204298.31283: done processing included files 49915 1727204298.31283: results queue empty 49915 1727204298.31284: checking for any_errors_fatal 49915 1727204298.31286: done checking for any_errors_fatal 49915 1727204298.31286: checking for max_fail_percentage 49915 1727204298.31287: done checking for max_fail_percentage 49915 1727204298.31287: checking to see if all hosts have failed and the running result is not ok 49915 1727204298.31288: done checking to see if all hosts have failed 49915 1727204298.31288: getting the remaining hosts for this loop 49915 1727204298.31289: done getting the remaining hosts for this loop 49915 1727204298.31291: getting the next task for host managed-node2 49915 1727204298.31293: done getting next task for host managed-node2 49915 1727204298.31294: ^ task is: TASK: Gather current interface info 49915 1727204298.31296: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204298.31298: getting variables 49915 1727204298.31298: in VariableManager get_vars() 49915 1727204298.31306: Calling all_inventory to load vars for managed-node2 49915 1727204298.31308: Calling groups_inventory to load vars for managed-node2 49915 1727204298.31309: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204298.31314: Calling all_plugins_play to load vars for managed-node2 49915 1727204298.31316: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204298.31318: Calling groups_plugins_play to load vars for managed-node2 49915 1727204298.31400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204298.31528: done with get_vars() 49915 1727204298.31534: done getting variables 49915 1727204298.31580: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:58:18 -0400 (0:00:00.036) 0:00:05.022 ***** 49915 1727204298.31606: entering _queue_task() for managed-node2/command 49915 1727204298.31851: worker is 1 (out of 1 available) 49915 1727204298.31862: exiting _queue_task() for managed-node2/command 49915 1727204298.31872: done queuing things up, now waiting for results queue to drain 49915 1727204298.31873: waiting for pending results... 
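The trace that follows records the 'Gather current interface info' task end to end: the controller reuses the multiplexed SSH connection, creates a remote temp directory, ships AnsiballZ_command.py, and executes it. The module_args captured in the result further down (chdir=/sys/class/net, _raw_params='ls -1') together with the registered variable name '_current_interfaces' seen in the next task suggest a task of roughly the following shape. This is a hedged sketch reconstructed from the trace, not the verbatim contents of get_current_interfaces.yml:

# Sketch only: reconstructed from the module_args and register name visible in
# this log; the real file at .../tasks/get_current_interfaces.yml may differ.
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net        # listing /sys/class/net enumerates the kernel's network interfaces
  register: _current_interfaces  # consumed by the later 'Set current_interfaces' task

As with the other tasks in this run, the executor first evaluates the conditional ansible_distribution_major_version != '6' (True here) before running the handler.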
49915 1727204298.32202: running TaskExecutor() for managed-node2/TASK: Gather current interface info 49915 1727204298.32230: in run() - task 028d2410-947f-dcd7-b5af-0000000001ff 49915 1727204298.32246: variable 'ansible_search_path' from source: unknown 49915 1727204298.32251: variable 'ansible_search_path' from source: unknown 49915 1727204298.32292: calling self._execute() 49915 1727204298.32374: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204298.32387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204298.32401: variable 'omit' from source: magic vars 49915 1727204298.32760: variable 'ansible_distribution_major_version' from source: facts 49915 1727204298.32792: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204298.32803: variable 'omit' from source: magic vars 49915 1727204298.32929: variable 'omit' from source: magic vars 49915 1727204298.33170: variable 'omit' from source: magic vars 49915 1727204298.33173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204298.33177: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204298.33180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204298.33189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204298.33202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204298.33302: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204298.33315: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204298.33324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204298.33583: Set connection var ansible_connection to ssh 49915 1727204298.33586: Set connection var ansible_shell_type to sh 49915 1727204298.33588: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204298.33591: Set connection var ansible_shell_executable to /bin/sh 49915 1727204298.33592: Set connection var ansible_timeout to 10 49915 1727204298.33594: Set connection var ansible_pipelining to False 49915 1727204298.33880: variable 'ansible_shell_executable' from source: unknown 49915 1727204298.33884: variable 'ansible_connection' from source: unknown 49915 1727204298.33886: variable 'ansible_module_compression' from source: unknown 49915 1727204298.33888: variable 'ansible_shell_type' from source: unknown 49915 1727204298.33890: variable 'ansible_shell_executable' from source: unknown 49915 1727204298.33892: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204298.33894: variable 'ansible_pipelining' from source: unknown 49915 1727204298.33896: variable 'ansible_timeout' from source: unknown 49915 1727204298.33897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204298.33900: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204298.33903: variable 'omit' from source: magic vars 49915 
1727204298.33906: starting attempt loop 49915 1727204298.33908: running the handler 49915 1727204298.34003: _low_level_execute_command(): starting 49915 1727204298.34061: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204298.34925: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49915 1727204298.35015: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204298.35087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204298.35199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204298.36901: stdout chunk (state=3): >>>/root <<< 49915 1727204298.37035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204298.37051: stdout chunk (state=3): >>><<< 49915 1727204298.37064: stderr chunk (state=3): >>><<< 49915 1727204298.37094: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204298.37112: _low_level_execute_command(): starting 49915 1727204298.37121: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204298.3710034-50337-146585497870452 `" && echo ansible-tmp-1727204298.3710034-50337-146585497870452="` echo /root/.ansible/tmp/ansible-tmp-1727204298.3710034-50337-146585497870452 `" ) && 
sleep 0' 49915 1727204298.37568: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204298.37585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 49915 1727204298.37599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204298.37664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204298.37667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204298.37784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204298.39710: stdout chunk (state=3): >>>ansible-tmp-1727204298.3710034-50337-146585497870452=/root/.ansible/tmp/ansible-tmp-1727204298.3710034-50337-146585497870452 <<< 49915 1727204298.39840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204298.39843: stdout chunk (state=3): >>><<< 49915 1727204298.39848: stderr chunk (state=3): >>><<< 49915 1727204298.39862: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204298.3710034-50337-146585497870452=/root/.ansible/tmp/ansible-tmp-1727204298.3710034-50337-146585497870452 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204298.39890: variable 'ansible_module_compression' from source: unknown 49915 1727204298.39930: ANSIBALLZ: Using generic lock for ansible.legacy.command 49915 1727204298.39933: ANSIBALLZ: Acquiring lock 49915 1727204298.39936: ANSIBALLZ: Lock acquired: 140698012046288 49915 1727204298.39938: ANSIBALLZ: 
Creating module 49915 1727204298.49064: ANSIBALLZ: Writing module into payload 49915 1727204298.49135: ANSIBALLZ: Writing module 49915 1727204298.49152: ANSIBALLZ: Renaming module 49915 1727204298.49157: ANSIBALLZ: Done creating module 49915 1727204298.49172: variable 'ansible_facts' from source: unknown 49915 1727204298.49221: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204298.3710034-50337-146585497870452/AnsiballZ_command.py 49915 1727204298.49324: Sending initial data 49915 1727204298.49327: Sent initial data (156 bytes) 49915 1727204298.49739: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204298.49772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204298.49777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204298.49780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204298.49782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204298.49785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204298.49834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204298.49837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204298.49840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204298.49920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204298.51566: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204298.51671: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49915 1727204298.51728: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp0mp61h6y /root/.ansible/tmp/ansible-tmp-1727204298.3710034-50337-146585497870452/AnsiballZ_command.py <<< 49915 1727204298.51733: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204298.3710034-50337-146585497870452/AnsiballZ_command.py" <<< 49915 1727204298.51808: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp0mp61h6y" to remote "/root/.ansible/tmp/ansible-tmp-1727204298.3710034-50337-146585497870452/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204298.3710034-50337-146585497870452/AnsiballZ_command.py" <<< 49915 1727204298.52643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204298.52703: stderr chunk (state=3): >>><<< 49915 1727204298.52707: stdout chunk (state=3): >>><<< 49915 1727204298.52792: done transferring module to remote 49915 1727204298.52794: _low_level_execute_command(): starting 49915 1727204298.52797: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204298.3710034-50337-146585497870452/ /root/.ansible/tmp/ansible-tmp-1727204298.3710034-50337-146585497870452/AnsiballZ_command.py && sleep 0' 49915 1727204298.53468: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204298.53472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204298.53565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204298.53620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204298.55557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204298.55561: stdout chunk (state=3): >>><<< 49915 1727204298.55563: stderr chunk (state=3): >>><<< 49915 1727204298.55700: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204298.55703: _low_level_execute_command(): starting 49915 1727204298.55706: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204298.3710034-50337-146585497870452/AnsiballZ_command.py && sleep 0' 49915 1727204298.56499: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204298.56517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204298.56541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204298.56660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204298.56663: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204298.56691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204298.56749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204298.56772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204298.56881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204298.72440: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:58:18.719756", "end": "2024-09-24 14:58:18.723099", "delta": "0:00:00.003343", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49915 1727204298.73961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 49915 1727204298.73974: stderr chunk (state=3): >>><<< 49915 1727204298.73979: stdout chunk (state=3): >>><<< 49915 1727204298.73999: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:58:18.719756", "end": "2024-09-24 14:58:18.723099", "delta": "0:00:00.003343", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
49915 1727204298.74027: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204298.3710034-50337-146585497870452/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204298.74034: _low_level_execute_command(): starting 49915 1727204298.74039: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204298.3710034-50337-146585497870452/ > /dev/null 2>&1 && sleep 0' 49915 1727204298.74446: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204298.74450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204298.74483: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204298.74486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 49915 1727204298.74488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204298.74494: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204298.74544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204298.74547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204298.74626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204298.76580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204298.76584: stdout chunk (state=3): >>><<< 49915 1727204298.76586: stderr chunk (state=3): >>><<< 49915 1727204298.76588: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204298.76590: handler run complete 49915 1727204298.76592: Evaluated conditional (False): False 49915 1727204298.76594: attempt loop complete, returning result 49915 1727204298.76596: _execute() done 49915 1727204298.76598: dumping result to json 49915 1727204298.76600: done dumping result, returning 49915 1727204298.76602: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [028d2410-947f-dcd7-b5af-0000000001ff] 49915 1727204298.76603: sending task result for task 028d2410-947f-dcd7-b5af-0000000001ff 49915 1727204298.76677: done sending task result for task 028d2410-947f-dcd7-b5af-0000000001ff 49915 1727204298.76681: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003343", "end": "2024-09-24 14:58:18.723099", "rc": 0, "start": "2024-09-24 14:58:18.719756" } STDOUT: bonding_masters eth0 lo 49915 1727204298.76762: no more pending results, returning what we have 49915 1727204298.76765: results queue empty 49915 1727204298.76766: checking for any_errors_fatal 49915 1727204298.76767: done checking for any_errors_fatal 49915 1727204298.76768: checking for max_fail_percentage 49915 1727204298.76769: done checking for max_fail_percentage 49915 1727204298.76770: checking to see if all hosts have failed and the running result is not ok 49915 1727204298.76771: done checking to see if all hosts have failed 49915 1727204298.76772: getting the remaining hosts for this loop 49915 1727204298.76774: done getting the remaining hosts for this loop 49915 1727204298.76780: getting the next task for host managed-node2 49915 1727204298.76787: done getting next task for host managed-node2 49915 1727204298.76790: ^ task is: TASK: Set current_interfaces 49915 1727204298.76794: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204298.76797: getting variables 49915 1727204298.76799: in VariableManager get_vars() 49915 1727204298.76845: Calling all_inventory to load vars for managed-node2 49915 1727204298.76848: Calling groups_inventory to load vars for managed-node2 49915 1727204298.76850: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204298.76862: Calling all_plugins_play to load vars for managed-node2 49915 1727204298.76865: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204298.76868: Calling groups_plugins_play to load vars for managed-node2 49915 1727204298.77354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204298.77654: done with get_vars() 49915 1727204298.77673: done getting variables 49915 1727204298.77741: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:58:18 -0400 (0:00:00.461) 0:00:05.484 ***** 49915 1727204298.77780: entering _queue_task() for managed-node2/set_fact 49915 1727204298.78502: worker is 1 (out of 1 available) 49915 1727204298.78510: exiting _queue_task() for managed-node2/set_fact 49915 1727204298.78523: done queuing things up, now waiting for results queue to drain 49915 1727204298.78524: waiting for pending results... 
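The 'Set current_interfaces' execution below reads the registered '_current_interfaces' result and publishes a current_interfaces fact whose value matches the stdout lines of the earlier ls command. The exact Jinja2 expression is not visible in this log; one plausible, hedged form that yields the result shown in the task output is:

# Hedged sketch; taking stdout_lines of the registered command result is one
# way to obtain ['bonding_masters', 'eth0', 'lo'] as seen in the result below.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"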
49915 1727204298.78745: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 49915 1727204298.78841: in run() - task 028d2410-947f-dcd7-b5af-000000000200 49915 1727204298.78846: variable 'ansible_search_path' from source: unknown 49915 1727204298.78849: variable 'ansible_search_path' from source: unknown 49915 1727204298.78863: calling self._execute() 49915 1727204298.78955: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204298.78972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204298.78988: variable 'omit' from source: magic vars 49915 1727204298.79390: variable 'ansible_distribution_major_version' from source: facts 49915 1727204298.79416: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204298.79426: variable 'omit' from source: magic vars 49915 1727204298.79498: variable 'omit' from source: magic vars 49915 1727204298.79591: variable '_current_interfaces' from source: set_fact 49915 1727204298.79708: variable 'omit' from source: magic vars 49915 1727204298.79739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204298.79766: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204298.79795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204298.79806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204298.79818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204298.79843: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204298.79846: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204298.79848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204298.79916: Set connection var ansible_connection to ssh 49915 1727204298.79921: Set connection var ansible_shell_type to sh 49915 1727204298.79924: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204298.79933: Set connection var ansible_shell_executable to /bin/sh 49915 1727204298.79937: Set connection var ansible_timeout to 10 49915 1727204298.79946: Set connection var ansible_pipelining to False 49915 1727204298.79961: variable 'ansible_shell_executable' from source: unknown 49915 1727204298.79964: variable 'ansible_connection' from source: unknown 49915 1727204298.79966: variable 'ansible_module_compression' from source: unknown 49915 1727204298.79969: variable 'ansible_shell_type' from source: unknown 49915 1727204298.79971: variable 'ansible_shell_executable' from source: unknown 49915 1727204298.79973: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204298.79978: variable 'ansible_pipelining' from source: unknown 49915 1727204298.79980: variable 'ansible_timeout' from source: unknown 49915 1727204298.79985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204298.80282: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 49915 1727204298.80286: variable 'omit' from source: magic vars 49915 1727204298.80289: starting attempt loop 49915 1727204298.80291: running the handler 49915 1727204298.80293: handler run complete 49915 1727204298.80295: attempt loop complete, returning result 49915 1727204298.80297: _execute() done 49915 1727204298.80299: dumping result to json 49915 1727204298.80300: done dumping result, returning 49915 1727204298.80303: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [028d2410-947f-dcd7-b5af-000000000200] 49915 1727204298.80305: sending task result for task 028d2410-947f-dcd7-b5af-000000000200 49915 1727204298.80364: done sending task result for task 028d2410-947f-dcd7-b5af-000000000200 49915 1727204298.80368: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 49915 1727204298.80428: no more pending results, returning what we have 49915 1727204298.80431: results queue empty 49915 1727204298.80432: checking for any_errors_fatal 49915 1727204298.80439: done checking for any_errors_fatal 49915 1727204298.80439: checking for max_fail_percentage 49915 1727204298.80441: done checking for max_fail_percentage 49915 1727204298.80442: checking to see if all hosts have failed and the running result is not ok 49915 1727204298.80443: done checking to see if all hosts have failed 49915 1727204298.80444: getting the remaining hosts for this loop 49915 1727204298.80446: done getting the remaining hosts for this loop 49915 1727204298.80450: getting the next task for host managed-node2 49915 1727204298.80458: done getting next task for host managed-node2 49915 1727204298.80461: ^ task is: TASK: Show current_interfaces 49915 1727204298.80464: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204298.80467: getting variables 49915 1727204298.80469: in VariableManager get_vars() 49915 1727204298.80515: Calling all_inventory to load vars for managed-node2 49915 1727204298.80521: Calling groups_inventory to load vars for managed-node2 49915 1727204298.80524: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204298.80535: Calling all_plugins_play to load vars for managed-node2 49915 1727204298.80538: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204298.80540: Calling groups_plugins_play to load vars for managed-node2 49915 1727204298.80883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204298.81074: done with get_vars() 49915 1727204298.81086: done getting variables 49915 1727204298.81178: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:58:18 -0400 (0:00:00.034) 0:00:05.518 ***** 49915 1727204298.81208: entering _queue_task() for managed-node2/debug 49915 1727204298.81210: Creating lock for debug 49915 1727204298.81471: worker is 1 (out of 1 available) 49915 1727204298.81683: exiting _queue_task() for managed-node2/debug 49915 1727204298.81691: done queuing things up, now waiting for results queue to drain 49915 1727204298.81693: waiting for pending results... 
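The 'Show current_interfaces' task below loads the debug action and prints the fact set in the previous step. A minimal debug task consistent with the MSG line in its result (the real template in show_interfaces.yml is not shown in this log) would be:

# Hedged sketch; prints the fact published by 'Set current_interfaces'.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"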
49915 1727204298.81739: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 49915 1727204298.81833: in run() - task 028d2410-947f-dcd7-b5af-000000000121 49915 1727204298.81850: variable 'ansible_search_path' from source: unknown 49915 1727204298.81857: variable 'ansible_search_path' from source: unknown 49915 1727204298.81897: calling self._execute() 49915 1727204298.81981: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204298.82034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204298.82048: variable 'omit' from source: magic vars 49915 1727204298.82746: variable 'ansible_distribution_major_version' from source: facts 49915 1727204298.83042: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204298.83045: variable 'omit' from source: magic vars 49915 1727204298.83048: variable 'omit' from source: magic vars 49915 1727204298.83100: variable 'current_interfaces' from source: set_fact 49915 1727204298.83183: variable 'omit' from source: magic vars 49915 1727204298.83229: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204298.83318: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204298.83345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204298.83395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204298.83419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204298.83456: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204298.83465: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204298.83477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204298.83571: Set connection var ansible_connection to ssh 49915 1727204298.83587: Set connection var ansible_shell_type to sh 49915 1727204298.83600: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204298.83620: Set connection var ansible_shell_executable to /bin/sh 49915 1727204298.83629: Set connection var ansible_timeout to 10 49915 1727204298.83639: Set connection var ansible_pipelining to False 49915 1727204298.83664: variable 'ansible_shell_executable' from source: unknown 49915 1727204298.83672: variable 'ansible_connection' from source: unknown 49915 1727204298.83681: variable 'ansible_module_compression' from source: unknown 49915 1727204298.83693: variable 'ansible_shell_type' from source: unknown 49915 1727204298.83699: variable 'ansible_shell_executable' from source: unknown 49915 1727204298.83706: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204298.83713: variable 'ansible_pipelining' from source: unknown 49915 1727204298.83719: variable 'ansible_timeout' from source: unknown 49915 1727204298.83725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204298.83868: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
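The run of "Set connection var" entries above shows the per-task connection settings the executor resolves before running the debug action; most report their source as "unknown" (i.e. defaults), while ansible_host and ansible_ssh_extra_args come from host vars. Purely as an illustration of where such values can be pinned, a host_vars sketch (not taken from this test inventory) would look like:

    # Illustrative host_vars entry only; the test run here relies on defaults
    # for everything except ansible_host and ansible_ssh_extra_args.
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_timeout: 10
    ansible_pipelining: false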
49915 1727204298.83889: variable 'omit' from source: magic vars 49915 1727204298.83898: starting attempt loop 49915 1727204298.83980: running the handler 49915 1727204298.83983: handler run complete 49915 1727204298.83986: attempt loop complete, returning result 49915 1727204298.83988: _execute() done 49915 1727204298.83990: dumping result to json 49915 1727204298.83992: done dumping result, returning 49915 1727204298.83998: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [028d2410-947f-dcd7-b5af-000000000121] 49915 1727204298.84012: sending task result for task 028d2410-947f-dcd7-b5af-000000000121 49915 1727204298.84113: done sending task result for task 028d2410-947f-dcd7-b5af-000000000121 49915 1727204298.84116: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 49915 1727204298.84168: no more pending results, returning what we have 49915 1727204298.84171: results queue empty 49915 1727204298.84172: checking for any_errors_fatal 49915 1727204298.84182: done checking for any_errors_fatal 49915 1727204298.84183: checking for max_fail_percentage 49915 1727204298.84185: done checking for max_fail_percentage 49915 1727204298.84186: checking to see if all hosts have failed and the running result is not ok 49915 1727204298.84187: done checking to see if all hosts have failed 49915 1727204298.84188: getting the remaining hosts for this loop 49915 1727204298.84189: done getting the remaining hosts for this loop 49915 1727204298.84193: getting the next task for host managed-node2 49915 1727204298.84201: done getting next task for host managed-node2 49915 1727204298.84204: ^ task is: TASK: Include the task 'manage_test_interface.yml' 49915 1727204298.84206: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204298.84210: getting variables 49915 1727204298.84211: in VariableManager get_vars() 49915 1727204298.84250: Calling all_inventory to load vars for managed-node2 49915 1727204298.84253: Calling groups_inventory to load vars for managed-node2 49915 1727204298.84255: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204298.84264: Calling all_plugins_play to load vars for managed-node2 49915 1727204298.84266: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204298.84269: Calling groups_plugins_play to load vars for managed-node2 49915 1727204298.84429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204298.84550: done with get_vars() 49915 1727204298.84558: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:12 Tuesday 24 September 2024 14:58:18 -0400 (0:00:00.034) 0:00:05.552 ***** 49915 1727204298.84624: entering _queue_task() for managed-node2/include_tasks 49915 1727204298.84817: worker is 1 (out of 1 available) 49915 1727204298.84831: exiting _queue_task() for managed-node2/include_tasks 49915 1727204298.84842: done queuing things up, now waiting for results queue to drain 49915 1727204298.84844: waiting for pending results... 
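The "Show current_interfaces" task whose result appears just above is the debug call at show_interfaces.yml:5; judging from the MSG format in the output, it is essentially:

    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"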
49915 1727204298.84995: running TaskExecutor() for managed-node2/TASK: Include the task 'manage_test_interface.yml' 49915 1727204298.85041: in run() - task 028d2410-947f-dcd7-b5af-00000000000c 49915 1727204298.85051: variable 'ansible_search_path' from source: unknown 49915 1727204298.85085: calling self._execute() 49915 1727204298.85145: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204298.85149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204298.85158: variable 'omit' from source: magic vars 49915 1727204298.85473: variable 'ansible_distribution_major_version' from source: facts 49915 1727204298.85489: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204298.85492: _execute() done 49915 1727204298.85495: dumping result to json 49915 1727204298.85514: done dumping result, returning 49915 1727204298.85518: done running TaskExecutor() for managed-node2/TASK: Include the task 'manage_test_interface.yml' [028d2410-947f-dcd7-b5af-00000000000c] 49915 1727204298.85539: sending task result for task 028d2410-947f-dcd7-b5af-00000000000c 49915 1727204298.85623: done sending task result for task 028d2410-947f-dcd7-b5af-00000000000c 49915 1727204298.85626: WORKER PROCESS EXITING 49915 1727204298.85672: no more pending results, returning what we have 49915 1727204298.85787: in VariableManager get_vars() 49915 1727204298.85826: Calling all_inventory to load vars for managed-node2 49915 1727204298.85834: Calling groups_inventory to load vars for managed-node2 49915 1727204298.85836: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204298.85848: Calling all_plugins_play to load vars for managed-node2 49915 1727204298.85851: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204298.85854: Calling groups_plugins_play to load vars for managed-node2 49915 1727204298.86070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204298.86266: done with get_vars() 49915 1727204298.86279: variable 'ansible_search_path' from source: unknown 49915 1727204298.86291: we have included files to process 49915 1727204298.86292: generating all_blocks data 49915 1727204298.86295: done generating all_blocks data 49915 1727204298.86299: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 49915 1727204298.86300: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 49915 1727204298.86303: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 49915 1727204298.87513: in VariableManager get_vars() 49915 1727204298.87537: done with get_vars() 49915 1727204298.87826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 49915 1727204298.88412: done processing included file 49915 1727204298.88414: iterating over new_blocks loaded from include file 49915 1727204298.88415: in VariableManager get_vars() 49915 1727204298.88432: done with get_vars() 49915 1727204298.88434: filtering new block on tags 49915 1727204298.88463: done filtering new block on tags 49915 1727204298.88465: done iterating over new_blocks loaded from include file included: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node2 49915 1727204298.88470: extending task lists for all hosts with included blocks 49915 1727204298.90544: done extending task lists 49915 1727204298.90546: done processing included files 49915 1727204298.90547: results queue empty 49915 1727204298.90547: checking for any_errors_fatal 49915 1727204298.90569: done checking for any_errors_fatal 49915 1727204298.90569: checking for max_fail_percentage 49915 1727204298.90570: done checking for max_fail_percentage 49915 1727204298.90571: checking to see if all hosts have failed and the running result is not ok 49915 1727204298.90572: done checking to see if all hosts have failed 49915 1727204298.90573: getting the remaining hosts for this loop 49915 1727204298.90574: done getting the remaining hosts for this loop 49915 1727204298.90583: getting the next task for host managed-node2 49915 1727204298.90587: done getting next task for host managed-node2 49915 1727204298.90589: ^ task is: TASK: Ensure state in ["present", "absent"] 49915 1727204298.90591: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204298.90593: getting variables 49915 1727204298.90594: in VariableManager get_vars() 49915 1727204298.90607: Calling all_inventory to load vars for managed-node2 49915 1727204298.90609: Calling groups_inventory to load vars for managed-node2 49915 1727204298.90611: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204298.90617: Calling all_plugins_play to load vars for managed-node2 49915 1727204298.90619: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204298.90621: Calling groups_plugins_play to load vars for managed-node2 49915 1727204298.90783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204298.91397: done with get_vars() 49915 1727204298.91407: done getting variables 49915 1727204298.91483: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 14:58:18 -0400 (0:00:00.068) 0:00:05.621 ***** 49915 1727204298.91509: entering _queue_task() for managed-node2/fail 49915 1727204298.91511: Creating lock for fail 49915 1727204298.92227: worker is 1 (out of 1 available) 49915 1727204298.92238: exiting _queue_task() for managed-node2/fail 49915 1727204298.92251: done queuing things up, now waiting for results queue to drain 49915 1727204298.92252: waiting for pending results... 
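The include at tests_vlan_mtu.yml:12 that pulls in manage_test_interface.yml is an ordinary include_tasks. The log reports 'state' as an include parameter and 'type' as a play variable, so a plausible shape is the sketch below; the actual values and relative path are not printed in this excerpt and are placeholders:

    - name: Include the task 'manage_test_interface.yml'
      include_tasks: tasks/manage_test_interface.yml   # path relative to the playbook; assumption
      vars:
        state: present   # placeholder value; the log only says it comes from include params
      # 'type' is defined at play level in tests_vlan_mtu.yml, per the
      # "variable 'type' from source: play vars" entry further down.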
49915 1727204298.92900: running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] 49915 1727204298.93187: in run() - task 028d2410-947f-dcd7-b5af-00000000021b 49915 1727204298.93191: variable 'ansible_search_path' from source: unknown 49915 1727204298.93194: variable 'ansible_search_path' from source: unknown 49915 1727204298.93198: calling self._execute() 49915 1727204298.93260: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204298.93302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204298.93315: variable 'omit' from source: magic vars 49915 1727204298.93897: variable 'ansible_distribution_major_version' from source: facts 49915 1727204298.93964: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204298.94135: variable 'state' from source: include params 49915 1727204298.94149: Evaluated conditional (state not in ["present", "absent"]): False 49915 1727204298.94165: when evaluation is False, skipping this task 49915 1727204298.94174: _execute() done 49915 1727204298.94184: dumping result to json 49915 1727204298.94191: done dumping result, returning 49915 1727204298.94200: done running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] [028d2410-947f-dcd7-b5af-00000000021b] 49915 1727204298.94209: sending task result for task 028d2410-947f-dcd7-b5af-00000000021b skipping: [managed-node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 49915 1727204298.94458: no more pending results, returning what we have 49915 1727204298.94463: results queue empty 49915 1727204298.94464: checking for any_errors_fatal 49915 1727204298.94465: done checking for any_errors_fatal 49915 1727204298.94466: checking for max_fail_percentage 49915 1727204298.94467: done checking for max_fail_percentage 49915 1727204298.94468: checking to see if all hosts have failed and the running result is not ok 49915 1727204298.94469: done checking to see if all hosts have failed 49915 1727204298.94470: getting the remaining hosts for this loop 49915 1727204298.94471: done getting the remaining hosts for this loop 49915 1727204298.94477: getting the next task for host managed-node2 49915 1727204298.94489: done getting next task for host managed-node2 49915 1727204298.94492: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 49915 1727204298.94495: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204298.94498: getting variables 49915 1727204298.94501: in VariableManager get_vars() 49915 1727204298.94544: Calling all_inventory to load vars for managed-node2 49915 1727204298.94547: Calling groups_inventory to load vars for managed-node2 49915 1727204298.94549: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204298.94564: Calling all_plugins_play to load vars for managed-node2 49915 1727204298.94567: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204298.94569: Calling groups_plugins_play to load vars for managed-node2 49915 1727204298.94955: done sending task result for task 028d2410-947f-dcd7-b5af-00000000021b 49915 1727204298.94958: WORKER PROCESS EXITING 49915 1727204298.94984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204298.95212: done with get_vars() 49915 1727204298.95222: done getting variables 49915 1727204298.95284: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 14:58:18 -0400 (0:00:00.037) 0:00:05.659 ***** 49915 1727204298.95313: entering _queue_task() for managed-node2/fail 49915 1727204298.95947: worker is 1 (out of 1 available) 49915 1727204298.95958: exiting _queue_task() for managed-node2/fail 49915 1727204298.95970: done queuing things up, now waiting for results queue to drain 49915 1727204298.95972: waiting for pending results... 
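Both guard tasks in manage_test_interface.yml are conditional fail calls. The false_condition fields in the skip results (state just above, type just below) give the exact expressions, so the pair at manage_test_interface.yml:3 and :8 is roughly:

    - name: Ensure state in ["present", "absent"]
      fail:
        msg: "state must be present or absent"   # message text is an assumption
      when: state not in ["present", "absent"]

    - name: Ensure type in ["dummy", "tap", "veth"]
      fail:
        msg: "type must be dummy, tap or veth"   # message text is an assumption
      when: type not in ["dummy", "tap", "veth"]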
49915 1727204298.96427: running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] 49915 1727204298.96608: in run() - task 028d2410-947f-dcd7-b5af-00000000021c 49915 1727204298.96633: variable 'ansible_search_path' from source: unknown 49915 1727204298.96642: variable 'ansible_search_path' from source: unknown 49915 1727204298.96735: calling self._execute() 49915 1727204298.97045: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204298.97049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204298.97053: variable 'omit' from source: magic vars 49915 1727204298.97496: variable 'ansible_distribution_major_version' from source: facts 49915 1727204298.97513: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204298.97695: variable 'type' from source: play vars 49915 1727204298.97709: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 49915 1727204298.97717: when evaluation is False, skipping this task 49915 1727204298.97725: _execute() done 49915 1727204298.97732: dumping result to json 49915 1727204298.97744: done dumping result, returning 49915 1727204298.97754: done running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] [028d2410-947f-dcd7-b5af-00000000021c] 49915 1727204298.97764: sending task result for task 028d2410-947f-dcd7-b5af-00000000021c 49915 1727204298.98081: done sending task result for task 028d2410-947f-dcd7-b5af-00000000021c 49915 1727204298.98085: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 49915 1727204298.98124: no more pending results, returning what we have 49915 1727204298.98128: results queue empty 49915 1727204298.98129: checking for any_errors_fatal 49915 1727204298.98135: done checking for any_errors_fatal 49915 1727204298.98136: checking for max_fail_percentage 49915 1727204298.98137: done checking for max_fail_percentage 49915 1727204298.98138: checking to see if all hosts have failed and the running result is not ok 49915 1727204298.98139: done checking to see if all hosts have failed 49915 1727204298.98140: getting the remaining hosts for this loop 49915 1727204298.98141: done getting the remaining hosts for this loop 49915 1727204298.98144: getting the next task for host managed-node2 49915 1727204298.98149: done getting next task for host managed-node2 49915 1727204298.98152: ^ task is: TASK: Include the task 'show_interfaces.yml' 49915 1727204298.98155: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204298.98159: getting variables 49915 1727204298.98160: in VariableManager get_vars() 49915 1727204298.98199: Calling all_inventory to load vars for managed-node2 49915 1727204298.98202: Calling groups_inventory to load vars for managed-node2 49915 1727204298.98204: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204298.98214: Calling all_plugins_play to load vars for managed-node2 49915 1727204298.98217: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204298.98220: Calling groups_plugins_play to load vars for managed-node2 49915 1727204298.98478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204298.98666: done with get_vars() 49915 1727204298.98678: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 14:58:18 -0400 (0:00:00.034) 0:00:05.693 ***** 49915 1727204298.98765: entering _queue_task() for managed-node2/include_tasks 49915 1727204298.99194: worker is 1 (out of 1 available) 49915 1727204298.99202: exiting _queue_task() for managed-node2/include_tasks 49915 1727204298.99211: done queuing things up, now waiting for results queue to drain 49915 1727204298.99212: waiting for pending results... 49915 1727204298.99263: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 49915 1727204298.99370: in run() - task 028d2410-947f-dcd7-b5af-00000000021d 49915 1727204298.99397: variable 'ansible_search_path' from source: unknown 49915 1727204298.99405: variable 'ansible_search_path' from source: unknown 49915 1727204298.99447: calling self._execute() 49915 1727204298.99527: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204298.99547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204298.99563: variable 'omit' from source: magic vars 49915 1727204298.99927: variable 'ansible_distribution_major_version' from source: facts 49915 1727204298.99955: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204298.99966: _execute() done 49915 1727204299.00041: dumping result to json 49915 1727204299.00057: done dumping result, returning 49915 1727204299.00064: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [028d2410-947f-dcd7-b5af-00000000021d] 49915 1727204299.00068: sending task result for task 028d2410-947f-dcd7-b5af-00000000021d 49915 1727204299.00242: no more pending results, returning what we have 49915 1727204299.00247: in VariableManager get_vars() 49915 1727204299.00308: Calling all_inventory to load vars for managed-node2 49915 1727204299.00311: Calling groups_inventory to load vars for managed-node2 49915 1727204299.00314: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204299.00335: Calling all_plugins_play to load vars for managed-node2 49915 1727204299.00339: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204299.00342: Calling groups_plugins_play to load vars for managed-node2 49915 1727204299.00753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204299.01374: done sending task result for task 028d2410-947f-dcd7-b5af-00000000021d 49915 1727204299.01380: WORKER 
PROCESS EXITING 49915 1727204299.01433: done with get_vars() 49915 1727204299.01443: variable 'ansible_search_path' from source: unknown 49915 1727204299.01444: variable 'ansible_search_path' from source: unknown 49915 1727204299.01483: we have included files to process 49915 1727204299.01485: generating all_blocks data 49915 1727204299.01486: done generating all_blocks data 49915 1727204299.01490: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49915 1727204299.01491: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49915 1727204299.01496: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49915 1727204299.01647: in VariableManager get_vars() 49915 1727204299.01800: done with get_vars() 49915 1727204299.02008: done processing included file 49915 1727204299.02010: iterating over new_blocks loaded from include file 49915 1727204299.02012: in VariableManager get_vars() 49915 1727204299.02034: done with get_vars() 49915 1727204299.02036: filtering new block on tags 49915 1727204299.02056: done filtering new block on tags 49915 1727204299.02058: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 49915 1727204299.02064: extending task lists for all hosts with included blocks 49915 1727204299.03362: done extending task lists 49915 1727204299.03363: done processing included files 49915 1727204299.03364: results queue empty 49915 1727204299.03364: checking for any_errors_fatal 49915 1727204299.03389: done checking for any_errors_fatal 49915 1727204299.03390: checking for max_fail_percentage 49915 1727204299.03391: done checking for max_fail_percentage 49915 1727204299.03392: checking to see if all hosts have failed and the running result is not ok 49915 1727204299.03393: done checking to see if all hosts have failed 49915 1727204299.03393: getting the remaining hosts for this loop 49915 1727204299.03394: done getting the remaining hosts for this loop 49915 1727204299.03397: getting the next task for host managed-node2 49915 1727204299.03401: done getting next task for host managed-node2 49915 1727204299.03403: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 49915 1727204299.03405: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204299.03407: getting variables 49915 1727204299.03408: in VariableManager get_vars() 49915 1727204299.03420: Calling all_inventory to load vars for managed-node2 49915 1727204299.03422: Calling groups_inventory to load vars for managed-node2 49915 1727204299.03424: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204299.03429: Calling all_plugins_play to load vars for managed-node2 49915 1727204299.03431: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204299.03434: Calling groups_plugins_play to load vars for managed-node2 49915 1727204299.03602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204299.03981: done with get_vars() 49915 1727204299.03994: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:58:19 -0400 (0:00:00.052) 0:00:05.746 ***** 49915 1727204299.04066: entering _queue_task() for managed-node2/include_tasks 49915 1727204299.04796: worker is 1 (out of 1 available) 49915 1727204299.04805: exiting _queue_task() for managed-node2/include_tasks 49915 1727204299.04815: done queuing things up, now waiting for results queue to drain 49915 1727204299.04816: waiting for pending results... 49915 1727204299.05085: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 49915 1727204299.05293: in run() - task 028d2410-947f-dcd7-b5af-000000000314 49915 1727204299.05317: variable 'ansible_search_path' from source: unknown 49915 1727204299.05325: variable 'ansible_search_path' from source: unknown 49915 1727204299.05449: calling self._execute() 49915 1727204299.05540: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204299.05552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204299.05567: variable 'omit' from source: magic vars 49915 1727204299.06071: variable 'ansible_distribution_major_version' from source: facts 49915 1727204299.06096: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204299.06109: _execute() done 49915 1727204299.06187: dumping result to json 49915 1727204299.06190: done dumping result, returning 49915 1727204299.06193: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [028d2410-947f-dcd7-b5af-000000000314] 49915 1727204299.06195: sending task result for task 028d2410-947f-dcd7-b5af-000000000314 49915 1727204299.06262: done sending task result for task 028d2410-947f-dcd7-b5af-000000000314 49915 1727204299.06265: WORKER PROCESS EXITING 49915 1727204299.06320: no more pending results, returning what we have 49915 1727204299.06326: in VariableManager get_vars() 49915 1727204299.06370: Calling all_inventory to load vars for managed-node2 49915 1727204299.06372: Calling groups_inventory to load vars for managed-node2 49915 1727204299.06374: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204299.06390: Calling all_plugins_play to load vars for managed-node2 49915 1727204299.06394: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204299.06397: Calling groups_plugins_play to load vars for managed-node2 49915 1727204299.07029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 49915 1727204299.07227: done with get_vars() 49915 1727204299.07235: variable 'ansible_search_path' from source: unknown 49915 1727204299.07236: variable 'ansible_search_path' from source: unknown 49915 1727204299.07337: we have included files to process 49915 1727204299.07338: generating all_blocks data 49915 1727204299.07340: done generating all_blocks data 49915 1727204299.07341: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49915 1727204299.07342: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49915 1727204299.07345: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49915 1727204299.07872: done processing included file 49915 1727204299.07877: iterating over new_blocks loaded from include file 49915 1727204299.07879: in VariableManager get_vars() 49915 1727204299.07900: done with get_vars() 49915 1727204299.07902: filtering new block on tags 49915 1727204299.07923: done filtering new block on tags 49915 1727204299.07926: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 49915 1727204299.07931: extending task lists for all hosts with included blocks 49915 1727204299.08133: done extending task lists 49915 1727204299.08135: done processing included files 49915 1727204299.08135: results queue empty 49915 1727204299.08136: checking for any_errors_fatal 49915 1727204299.08139: done checking for any_errors_fatal 49915 1727204299.08140: checking for max_fail_percentage 49915 1727204299.08141: done checking for max_fail_percentage 49915 1727204299.08142: checking to see if all hosts have failed and the running result is not ok 49915 1727204299.08143: done checking to see if all hosts have failed 49915 1727204299.08144: getting the remaining hosts for this loop 49915 1727204299.08145: done getting the remaining hosts for this loop 49915 1727204299.08147: getting the next task for host managed-node2 49915 1727204299.08152: done getting next task for host managed-node2 49915 1727204299.08154: ^ task is: TASK: Gather current interface info 49915 1727204299.08158: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 49915 1727204299.08160: getting variables 49915 1727204299.08161: in VariableManager get_vars() 49915 1727204299.08181: Calling all_inventory to load vars for managed-node2 49915 1727204299.08184: Calling groups_inventory to load vars for managed-node2 49915 1727204299.08187: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204299.08192: Calling all_plugins_play to load vars for managed-node2 49915 1727204299.08195: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204299.08198: Calling groups_plugins_play to load vars for managed-node2 49915 1727204299.08352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204299.08583: done with get_vars() 49915 1727204299.08596: done getting variables 49915 1727204299.08653: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:58:19 -0400 (0:00:00.046) 0:00:05.793 ***** 49915 1727204299.08694: entering _queue_task() for managed-node2/command 49915 1727204299.09187: worker is 1 (out of 1 available) 49915 1727204299.09200: exiting _queue_task() for managed-node2/command 49915 1727204299.09216: done queuing things up, now waiting for results queue to drain 49915 1727204299.09218: waiting for pending results... 
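The "Gather current interface info" task queued above is the command call at get_current_interfaces.yml:3. The module arguments echoed in the result further down this run (chdir /sys/class/net, ls -1) pin it down to roughly the following; only the register name is a placeholder:

    - name: Gather current interface info
      command: ls -1
      args:
        chdir: /sys/class/net
      register: interface_listing   # placeholder name; the real register variable is not shown here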
49915 1727204299.09517: running TaskExecutor() for managed-node2/TASK: Gather current interface info 49915 1727204299.09623: in run() - task 028d2410-947f-dcd7-b5af-00000000034b 49915 1727204299.09646: variable 'ansible_search_path' from source: unknown 49915 1727204299.09661: variable 'ansible_search_path' from source: unknown 49915 1727204299.09782: calling self._execute() 49915 1727204299.09838: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204299.09857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204299.09883: variable 'omit' from source: magic vars 49915 1727204299.10682: variable 'ansible_distribution_major_version' from source: facts 49915 1727204299.10720: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204299.10816: variable 'omit' from source: magic vars 49915 1727204299.10865: variable 'omit' from source: magic vars 49915 1727204299.11013: variable 'omit' from source: magic vars 49915 1727204299.11214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204299.11361: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204299.11380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204299.11586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204299.11590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204299.11593: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204299.11595: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204299.11597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204299.12126: Set connection var ansible_connection to ssh 49915 1727204299.12128: Set connection var ansible_shell_type to sh 49915 1727204299.12130: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204299.12132: Set connection var ansible_shell_executable to /bin/sh 49915 1727204299.12134: Set connection var ansible_timeout to 10 49915 1727204299.12136: Set connection var ansible_pipelining to False 49915 1727204299.12137: variable 'ansible_shell_executable' from source: unknown 49915 1727204299.12139: variable 'ansible_connection' from source: unknown 49915 1727204299.12141: variable 'ansible_module_compression' from source: unknown 49915 1727204299.12142: variable 'ansible_shell_type' from source: unknown 49915 1727204299.12144: variable 'ansible_shell_executable' from source: unknown 49915 1727204299.12146: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204299.12148: variable 'ansible_pipelining' from source: unknown 49915 1727204299.12149: variable 'ansible_timeout' from source: unknown 49915 1727204299.12151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204299.12322: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204299.12462: variable 'omit' from source: magic vars 49915 
1727204299.12470: starting attempt loop 49915 1727204299.12477: running the handler 49915 1727204299.12495: _low_level_execute_command(): starting 49915 1727204299.12504: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204299.13793: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204299.13859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204299.13889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204299.13903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204299.14011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204299.15751: stdout chunk (state=3): >>>/root <<< 49915 1727204299.15983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204299.15988: stdout chunk (state=3): >>><<< 49915 1727204299.15990: stderr chunk (state=3): >>><<< 49915 1727204299.15998: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204299.16014: _low_level_execute_command(): starting 49915 1727204299.16181: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204299.1599956-50487-13281294840734 `" && echo ansible-tmp-1727204299.1599956-50487-13281294840734="` echo 
/root/.ansible/tmp/ansible-tmp-1727204299.1599956-50487-13281294840734 `" ) && sleep 0' 49915 1727204299.17277: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204299.17367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204299.17401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204299.17509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204299.19447: stdout chunk (state=3): >>>ansible-tmp-1727204299.1599956-50487-13281294840734=/root/.ansible/tmp/ansible-tmp-1727204299.1599956-50487-13281294840734 <<< 49915 1727204299.19557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204299.19602: stderr chunk (state=3): >>><<< 49915 1727204299.19633: stdout chunk (state=3): >>><<< 49915 1727204299.19700: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204299.1599956-50487-13281294840734=/root/.ansible/tmp/ansible-tmp-1727204299.1599956-50487-13281294840734 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204299.19799: variable 'ansible_module_compression' from source: unknown 49915 1727204299.20134: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49915 1727204299.20183: variable 'ansible_facts' from source: unknown 49915 1727204299.20316: transferring module to 
remote /root/.ansible/tmp/ansible-tmp-1727204299.1599956-50487-13281294840734/AnsiballZ_command.py 49915 1727204299.20967: Sending initial data 49915 1727204299.20970: Sent initial data (155 bytes) 49915 1727204299.21583: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204299.21891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204299.22110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204299.22184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204299.23778: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 49915 1727204299.23786: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 49915 1727204299.23793: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 49915 1727204299.23805: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204299.23897: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49915 1727204299.23976: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpskrd16or /root/.ansible/tmp/ansible-tmp-1727204299.1599956-50487-13281294840734/AnsiballZ_command.py <<< 49915 1727204299.23980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204299.1599956-50487-13281294840734/AnsiballZ_command.py" <<< 49915 1727204299.24041: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpskrd16or" to remote "/root/.ansible/tmp/ansible-tmp-1727204299.1599956-50487-13281294840734/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204299.1599956-50487-13281294840734/AnsiballZ_command.py" <<< 49915 1727204299.25043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204299.25047: stdout chunk (state=3): >>><<< 49915 1727204299.25049: stderr chunk (state=3): >>><<< 49915 1727204299.25051: done transferring module to remote 49915 1727204299.25053: _low_level_execute_command(): starting 49915 1727204299.25056: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204299.1599956-50487-13281294840734/ /root/.ansible/tmp/ansible-tmp-1727204299.1599956-50487-13281294840734/AnsiballZ_command.py && sleep 0' 49915 1727204299.25916: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204299.26096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204299.26121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204299.27939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204299.27983: stderr chunk (state=3): >>><<< 49915 1727204299.27986: stdout chunk (state=3): >>><<< 49915 1727204299.28005: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204299.28013: _low_level_execute_command(): starting 49915 1727204299.28016: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204299.1599956-50487-13281294840734/AnsiballZ_command.py && sleep 0' 49915 1727204299.28633: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204299.28677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204299.28681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204299.28683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204299.28686: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204299.28688: stderr chunk (state=3): >>>debug2: match not found <<< 49915 1727204299.28690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204299.28780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49915 1727204299.28783: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 49915 1727204299.28785: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 49915 1727204299.28787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204299.28789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204299.28791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204299.28793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204299.28795: stderr chunk (state=3): >>>debug2: match found <<< 49915 1727204299.28797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204299.28822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204299.28838: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204299.28849: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204299.28963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204299.44509: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:58:19.440445", "end": "2024-09-24 14:58:19.443760", "delta": "0:00:00.003315", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, 
"strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49915 1727204299.46158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 49915 1727204299.46163: stdout chunk (state=3): >>><<< 49915 1727204299.46168: stderr chunk (state=3): >>><<< 49915 1727204299.46237: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:58:19.440445", "end": "2024-09-24 14:58:19.443760", "delta": "0:00:00.003315", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
49915 1727204299.46362: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204299.1599956-50487-13281294840734/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204299.46365: _low_level_execute_command(): starting 49915 1727204299.46368: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204299.1599956-50487-13281294840734/ > /dev/null 2>&1 && sleep 0' 49915 1727204299.47109: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204299.47133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204299.47173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204299.47250: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49915 1727204299.47268: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204299.47306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204299.47326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204299.47374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204299.47451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204299.49481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204299.49485: stdout chunk (state=3): >>><<< 49915 1727204299.49493: stderr chunk (state=3): >>><<< 49915 1727204299.49497: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204299.49508: handler run complete 49915 1727204299.49510: Evaluated conditional (False): False 49915 1727204299.49514: attempt loop complete, returning result 49915 1727204299.49516: _execute() done 49915 1727204299.49522: dumping result to json 49915 1727204299.49525: done dumping result, returning 49915 1727204299.49527: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [028d2410-947f-dcd7-b5af-00000000034b] 49915 1727204299.49529: sending task result for task 028d2410-947f-dcd7-b5af-00000000034b 49915 1727204299.49634: done sending task result for task 028d2410-947f-dcd7-b5af-00000000034b 49915 1727204299.49637: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003315", "end": "2024-09-24 14:58:19.443760", "rc": 0, "start": "2024-09-24 14:58:19.440445" } STDOUT: bonding_masters eth0 lo 49915 1727204299.49813: no more pending results, returning what we have 49915 1727204299.49816: results queue empty 49915 1727204299.49817: checking for any_errors_fatal 49915 1727204299.49818: done checking for any_errors_fatal 49915 1727204299.49819: checking for max_fail_percentage 49915 1727204299.49821: done checking for max_fail_percentage 49915 1727204299.49822: checking to see if all hosts have failed and the running result is not ok 49915 1727204299.49823: done checking to see if all hosts have failed 49915 1727204299.49823: getting the remaining hosts for this loop 49915 1727204299.49825: done getting the remaining hosts for this loop 49915 1727204299.49829: getting the next task for host managed-node2 49915 1727204299.49835: done getting next task for host managed-node2 49915 1727204299.49838: ^ task is: TASK: Set current_interfaces 49915 1727204299.49843: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204299.49847: getting variables 49915 1727204299.49849: in VariableManager get_vars() 49915 1727204299.50466: Calling all_inventory to load vars for managed-node2 49915 1727204299.50469: Calling groups_inventory to load vars for managed-node2 49915 1727204299.50471: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204299.50483: Calling all_plugins_play to load vars for managed-node2 49915 1727204299.50485: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204299.50488: Calling groups_plugins_play to load vars for managed-node2 49915 1727204299.50648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204299.50858: done with get_vars() 49915 1727204299.50869: done getting variables 49915 1727204299.50930: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:58:19 -0400 (0:00:00.422) 0:00:06.215 ***** 49915 1727204299.50961: entering _queue_task() for managed-node2/set_fact 49915 1727204299.51256: worker is 1 (out of 1 available) 49915 1727204299.51273: exiting _queue_task() for managed-node2/set_fact 49915 1727204299.51388: done queuing things up, now waiting for results queue to drain 49915 1727204299.51390: waiting for pending results... 
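The "Gather current interface info" result above (rc=0, stdout "bonding_masters\neth0\nlo") comes from ansible.legacy.command invoked with chdir "/sys/class/net" and _raw_params "ls -1", exactly as printed in the module invocation. A minimal task sketch consistent with that invocation follows; the register name _current_interfaces is inferred from the variable lookup later in the log, and the actual get_current_interfaces.yml in the collection may differ (for example, the ansible_distribution_major_version != '6' condition seen in the log may be inherited from a parent block rather than written on this task).

# Sketch reconstructed from the module_args in the log above,
# not copied from the real test file.
- name: Gather current interface info
  command: ls -1              # free-form, matches "_raw_params": "ls -1"
  args:
    chdir: /sys/class/net     # matches "chdir": "/sys/class/net"
  register: _current_interfaces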
49915 1727204299.51695: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 49915 1727204299.51702: in run() - task 028d2410-947f-dcd7-b5af-00000000034c 49915 1727204299.51706: variable 'ansible_search_path' from source: unknown 49915 1727204299.51709: variable 'ansible_search_path' from source: unknown 49915 1727204299.51766: calling self._execute() 49915 1727204299.51890: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204299.51909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204299.51929: variable 'omit' from source: magic vars 49915 1727204299.52388: variable 'ansible_distribution_major_version' from source: facts 49915 1727204299.52405: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204299.52423: variable 'omit' from source: magic vars 49915 1727204299.52496: variable 'omit' from source: magic vars 49915 1727204299.52680: variable '_current_interfaces' from source: set_fact 49915 1727204299.52726: variable 'omit' from source: magic vars 49915 1727204299.52793: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204299.52845: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204299.52873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204299.52898: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204299.52923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204299.52957: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204299.52965: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204299.53030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204299.53080: Set connection var ansible_connection to ssh 49915 1727204299.53088: Set connection var ansible_shell_type to sh 49915 1727204299.53099: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204299.53118: Set connection var ansible_shell_executable to /bin/sh 49915 1727204299.53129: Set connection var ansible_timeout to 10 49915 1727204299.53201: Set connection var ansible_pipelining to False 49915 1727204299.53230: variable 'ansible_shell_executable' from source: unknown 49915 1727204299.53238: variable 'ansible_connection' from source: unknown 49915 1727204299.53248: variable 'ansible_module_compression' from source: unknown 49915 1727204299.53254: variable 'ansible_shell_type' from source: unknown 49915 1727204299.53260: variable 'ansible_shell_executable' from source: unknown 49915 1727204299.53266: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204299.53354: variable 'ansible_pipelining' from source: unknown 49915 1727204299.53358: variable 'ansible_timeout' from source: unknown 49915 1727204299.53361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204299.53440: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 49915 1727204299.53456: variable 'omit' from source: magic vars 49915 1727204299.53469: starting attempt loop 49915 1727204299.53477: running the handler 49915 1727204299.53492: handler run complete 49915 1727204299.53506: attempt loop complete, returning result 49915 1727204299.53515: _execute() done 49915 1727204299.53521: dumping result to json 49915 1727204299.53529: done dumping result, returning 49915 1727204299.53573: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [028d2410-947f-dcd7-b5af-00000000034c] 49915 1727204299.53577: sending task result for task 028d2410-947f-dcd7-b5af-00000000034c ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 49915 1727204299.53738: no more pending results, returning what we have 49915 1727204299.53742: results queue empty 49915 1727204299.53743: checking for any_errors_fatal 49915 1727204299.53750: done checking for any_errors_fatal 49915 1727204299.53751: checking for max_fail_percentage 49915 1727204299.53753: done checking for max_fail_percentage 49915 1727204299.53754: checking to see if all hosts have failed and the running result is not ok 49915 1727204299.53755: done checking to see if all hosts have failed 49915 1727204299.53756: getting the remaining hosts for this loop 49915 1727204299.53757: done getting the remaining hosts for this loop 49915 1727204299.53762: getting the next task for host managed-node2 49915 1727204299.53772: done getting next task for host managed-node2 49915 1727204299.53775: ^ task is: TASK: Show current_interfaces 49915 1727204299.53782: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204299.53786: getting variables 49915 1727204299.53788: in VariableManager get_vars() 49915 1727204299.53832: Calling all_inventory to load vars for managed-node2 49915 1727204299.53835: Calling groups_inventory to load vars for managed-node2 49915 1727204299.53837: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204299.53848: Calling all_plugins_play to load vars for managed-node2 49915 1727204299.53850: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204299.53853: Calling groups_plugins_play to load vars for managed-node2 49915 1727204299.54375: done sending task result for task 028d2410-947f-dcd7-b5af-00000000034c 49915 1727204299.54381: WORKER PROCESS EXITING 49915 1727204299.54404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204299.54838: done with get_vars() 49915 1727204299.54849: done getting variables 49915 1727204299.55121: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:58:19 -0400 (0:00:00.041) 0:00:06.257 ***** 49915 1727204299.55155: entering _queue_task() for managed-node2/debug 49915 1727204299.55745: worker is 1 (out of 1 available) 49915 1727204299.55760: exiting _queue_task() for managed-node2/debug 49915 1727204299.55774: done queuing things up, now waiting for results queue to drain 49915 1727204299.55777: waiting for pending results... 
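The "Set current_interfaces" result above turns the registered command output into the fact current_interfaces = ["bonding_masters", "eth0", "lo"]. A plausible sketch, assuming the fact is built from _current_interfaces.stdout_lines (the log shows only the source variable name, not the exact Jinja expression):

# Hypothetical reconstruction of "Set current_interfaces"
# (get_current_interfaces.yml:9); the stdout_lines expression is an
# assumption consistent with the resulting three-element list.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"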
49915 1727204299.56078: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 49915 1727204299.56189: in run() - task 028d2410-947f-dcd7-b5af-000000000315 49915 1727204299.56202: variable 'ansible_search_path' from source: unknown 49915 1727204299.56205: variable 'ansible_search_path' from source: unknown 49915 1727204299.56242: calling self._execute() 49915 1727204299.56330: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204299.56336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204299.56346: variable 'omit' from source: magic vars 49915 1727204299.56701: variable 'ansible_distribution_major_version' from source: facts 49915 1727204299.56716: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204299.56719: variable 'omit' from source: magic vars 49915 1727204299.56764: variable 'omit' from source: magic vars 49915 1727204299.56857: variable 'current_interfaces' from source: set_fact 49915 1727204299.56888: variable 'omit' from source: magic vars 49915 1727204299.56927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204299.56965: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204299.56986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204299.57002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204299.57017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204299.57047: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204299.57055: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204299.57058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204299.57145: Set connection var ansible_connection to ssh 49915 1727204299.57149: Set connection var ansible_shell_type to sh 49915 1727204299.57154: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204299.57169: Set connection var ansible_shell_executable to /bin/sh 49915 1727204299.57180: Set connection var ansible_timeout to 10 49915 1727204299.57182: Set connection var ansible_pipelining to False 49915 1727204299.57289: variable 'ansible_shell_executable' from source: unknown 49915 1727204299.57293: variable 'ansible_connection' from source: unknown 49915 1727204299.57295: variable 'ansible_module_compression' from source: unknown 49915 1727204299.57297: variable 'ansible_shell_type' from source: unknown 49915 1727204299.57299: variable 'ansible_shell_executable' from source: unknown 49915 1727204299.57301: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204299.57304: variable 'ansible_pipelining' from source: unknown 49915 1727204299.57306: variable 'ansible_timeout' from source: unknown 49915 1727204299.57308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204299.57355: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
49915 1727204299.57365: variable 'omit' from source: magic vars 49915 1727204299.57370: starting attempt loop 49915 1727204299.57373: running the handler 49915 1727204299.57423: handler run complete 49915 1727204299.57437: attempt loop complete, returning result 49915 1727204299.57440: _execute() done 49915 1727204299.57443: dumping result to json 49915 1727204299.57445: done dumping result, returning 49915 1727204299.57453: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [028d2410-947f-dcd7-b5af-000000000315] 49915 1727204299.57458: sending task result for task 028d2410-947f-dcd7-b5af-000000000315 49915 1727204299.57731: done sending task result for task 028d2410-947f-dcd7-b5af-000000000315 49915 1727204299.57734: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 49915 1727204299.57777: no more pending results, returning what we have 49915 1727204299.57780: results queue empty 49915 1727204299.57781: checking for any_errors_fatal 49915 1727204299.57786: done checking for any_errors_fatal 49915 1727204299.57786: checking for max_fail_percentage 49915 1727204299.57788: done checking for max_fail_percentage 49915 1727204299.57789: checking to see if all hosts have failed and the running result is not ok 49915 1727204299.57790: done checking to see if all hosts have failed 49915 1727204299.57790: getting the remaining hosts for this loop 49915 1727204299.57791: done getting the remaining hosts for this loop 49915 1727204299.57796: getting the next task for host managed-node2 49915 1727204299.57803: done getting next task for host managed-node2 49915 1727204299.57808: ^ task is: TASK: Install iproute 49915 1727204299.57811: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204299.57817: getting variables 49915 1727204299.57818: in VariableManager get_vars() 49915 1727204299.57853: Calling all_inventory to load vars for managed-node2 49915 1727204299.57856: Calling groups_inventory to load vars for managed-node2 49915 1727204299.57860: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204299.57870: Calling all_plugins_play to load vars for managed-node2 49915 1727204299.57872: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204299.57877: Calling groups_plugins_play to load vars for managed-node2 49915 1727204299.58056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204299.58252: done with get_vars() 49915 1727204299.58262: done getting variables 49915 1727204299.58321: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 14:58:19 -0400 (0:00:00.031) 0:00:06.289 ***** 49915 1727204299.58353: entering _queue_task() for managed-node2/package 49915 1727204299.58643: worker is 1 (out of 1 available) 49915 1727204299.58657: exiting _queue_task() for managed-node2/package 49915 1727204299.58670: done queuing things up, now waiting for results queue to drain 49915 1727204299.58671: waiting for pending results... 
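"Show current_interfaces" is a debug action (show_interfaces.yml:5) whose rendered message is "current_interfaces: ['bonding_masters', 'eth0', 'lo']". A minimal sketch that would produce that exact output; the real file may format the message differently:

# Sketch only; consistent with the MSG line printed for this task.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"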
49915 1727204299.58942: running TaskExecutor() for managed-node2/TASK: Install iproute 49915 1727204299.59046: in run() - task 028d2410-947f-dcd7-b5af-00000000021e 49915 1727204299.59067: variable 'ansible_search_path' from source: unknown 49915 1727204299.59074: variable 'ansible_search_path' from source: unknown 49915 1727204299.59128: calling self._execute() 49915 1727204299.59223: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204299.59240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204299.59253: variable 'omit' from source: magic vars 49915 1727204299.60084: variable 'ansible_distribution_major_version' from source: facts 49915 1727204299.60087: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204299.60088: variable 'omit' from source: magic vars 49915 1727204299.60091: variable 'omit' from source: magic vars 49915 1727204299.60130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204299.62646: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204299.62726: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204299.62768: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204299.62816: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204299.62862: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204299.62970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204299.63006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204299.63045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204299.63151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204299.63171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204299.63287: variable '__network_is_ostree' from source: set_fact 49915 1727204299.63298: variable 'omit' from source: magic vars 49915 1727204299.63335: variable 'omit' from source: magic vars 49915 1727204299.63371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204299.63404: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204299.63452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204299.63455: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 49915 1727204299.63465: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204299.63499: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204299.63507: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204299.63560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204299.63619: Set connection var ansible_connection to ssh 49915 1727204299.63627: Set connection var ansible_shell_type to sh 49915 1727204299.63638: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204299.63651: Set connection var ansible_shell_executable to /bin/sh 49915 1727204299.63659: Set connection var ansible_timeout to 10 49915 1727204299.63674: Set connection var ansible_pipelining to False 49915 1727204299.63705: variable 'ansible_shell_executable' from source: unknown 49915 1727204299.63712: variable 'ansible_connection' from source: unknown 49915 1727204299.63720: variable 'ansible_module_compression' from source: unknown 49915 1727204299.63727: variable 'ansible_shell_type' from source: unknown 49915 1727204299.63793: variable 'ansible_shell_executable' from source: unknown 49915 1727204299.63796: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204299.63799: variable 'ansible_pipelining' from source: unknown 49915 1727204299.63801: variable 'ansible_timeout' from source: unknown 49915 1727204299.63803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204299.63885: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204299.63900: variable 'omit' from source: magic vars 49915 1727204299.63912: starting attempt loop 49915 1727204299.63919: running the handler 49915 1727204299.63930: variable 'ansible_facts' from source: unknown 49915 1727204299.63994: variable 'ansible_facts' from source: unknown 49915 1727204299.63997: _low_level_execute_command(): starting 49915 1727204299.63999: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204299.65122: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204299.65126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 49915 1727204299.65129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 49915 1727204299.65131: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204299.65247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204299.65300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204299.67043: stdout chunk (state=3): >>>/root <<< 49915 1727204299.67183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204299.67205: stdout chunk (state=3): >>><<< 49915 1727204299.67208: stderr chunk (state=3): >>><<< 49915 1727204299.67226: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204299.67330: _low_level_execute_command(): starting 49915 1727204299.67334: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204299.6723936-50559-117470776546124 `" && echo ansible-tmp-1727204299.6723936-50559-117470776546124="` echo /root/.ansible/tmp/ansible-tmp-1727204299.6723936-50559-117470776546124 `" ) && sleep 0' 49915 1727204299.68585: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204299.68589: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204299.68592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204299.68595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204299.68599: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204299.68602: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204299.68687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204299.68691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204299.68694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204299.68895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204299.70761: stdout chunk (state=3): >>>ansible-tmp-1727204299.6723936-50559-117470776546124=/root/.ansible/tmp/ansible-tmp-1727204299.6723936-50559-117470776546124 <<< 49915 1727204299.70868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204299.70923: stderr chunk (state=3): >>><<< 49915 1727204299.70943: stdout chunk (state=3): >>><<< 49915 1727204299.70964: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204299.6723936-50559-117470776546124=/root/.ansible/tmp/ansible-tmp-1727204299.6723936-50559-117470776546124 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204299.71003: variable 'ansible_module_compression' from source: unknown 49915 1727204299.71078: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 49915 1727204299.71087: ANSIBALLZ: Acquiring lock 49915 1727204299.71093: ANSIBALLZ: Lock acquired: 140698012046288 49915 1727204299.71100: ANSIBALLZ: Creating module 49915 1727204299.90989: ANSIBALLZ: Writing module into payload 49915 1727204299.91201: ANSIBALLZ: Writing module 49915 1727204299.91237: ANSIBALLZ: Renaming module 49915 1727204299.91256: ANSIBALLZ: Done creating module 49915 1727204299.91292: variable 'ansible_facts' from source: unknown 49915 1727204299.91486: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204299.6723936-50559-117470776546124/AnsiballZ_dnf.py 49915 1727204299.91607: Sending initial data 49915 1727204299.91616: Sent initial data (152 bytes) 49915 1727204299.92400: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204299.92427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204299.92446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204299.92467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204299.92590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204299.94239: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204299.94329: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49915 1727204299.94430: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp8voadd8e /root/.ansible/tmp/ansible-tmp-1727204299.6723936-50559-117470776546124/AnsiballZ_dnf.py <<< 49915 1727204299.94441: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204299.6723936-50559-117470776546124/AnsiballZ_dnf.py" <<< 49915 1727204299.94500: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp8voadd8e" to remote "/root/.ansible/tmp/ansible-tmp-1727204299.6723936-50559-117470776546124/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204299.6723936-50559-117470776546124/AnsiballZ_dnf.py" <<< 49915 1727204299.96391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204299.96394: stderr chunk (state=3): >>><<< 49915 1727204299.96397: stdout chunk (state=3): >>><<< 49915 1727204299.96403: done transferring module to remote 49915 1727204299.96405: _low_level_execute_command(): starting 49915 1727204299.96408: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204299.6723936-50559-117470776546124/ /root/.ansible/tmp/ansible-tmp-1727204299.6723936-50559-117470776546124/AnsiballZ_dnf.py && sleep 0' 49915 1727204299.97389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204299.97403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204299.97420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204299.97445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204299.97461: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204299.97472: stderr chunk (state=3): >>>debug2: match not found <<< 49915 1727204299.97488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204299.97506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49915 1727204299.97519: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 49915 1727204299.97530: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 49915 1727204299.97585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204299.97597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204299.97691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204299.97707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204299.97996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204299.98101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204299.99986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204299.99990: stdout chunk 
(state=3): >>><<< 49915 1727204299.99992: stderr chunk (state=3): >>><<< 49915 1727204300.00008: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204300.00017: _low_level_execute_command(): starting 49915 1727204300.00027: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204299.6723936-50559-117470776546124/AnsiballZ_dnf.py && sleep 0' 49915 1727204300.00725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204300.00745: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204300.00759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204300.00858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204300.00861: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204300.00890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204300.00908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204300.01082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204300.01198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204300.42409: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": 
[], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 49915 1727204300.53569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 49915 1727204300.53573: stdout chunk (state=3): >>><<< 49915 1727204300.53577: stderr chunk (state=3): >>><<< 49915 1727204300.53597: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
49915 1727204300.53729: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204299.6723936-50559-117470776546124/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204300.53733: _low_level_execute_command(): starting 49915 1727204300.53736: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204299.6723936-50559-117470776546124/ > /dev/null 2>&1 && sleep 0' 49915 1727204300.54214: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204300.54223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204300.54237: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204300.54252: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204300.54263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204300.54272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204300.54315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204300.54328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204300.54409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204300.56337: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204300.56340: stdout chunk (state=3): >>><<< 49915 1727204300.56343: stderr chunk (state=3): >>><<< 49915 1727204300.56359: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204300.56370: handler run complete 49915 1727204300.56619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204300.56774: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204300.56804: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204300.56830: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204300.56851: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204300.56913: variable '__install_status' from source: unknown 49915 1727204300.56928: Evaluated conditional (__install_status is success): True 49915 1727204300.56940: attempt loop complete, returning result 49915 1727204300.56943: _execute() done 49915 1727204300.56945: dumping result to json 49915 1727204300.56950: done dumping result, returning 49915 1727204300.56957: done running TaskExecutor() for managed-node2/TASK: Install iproute [028d2410-947f-dcd7-b5af-00000000021e] 49915 1727204300.56961: sending task result for task 028d2410-947f-dcd7-b5af-00000000021e 49915 1727204300.57055: done sending task result for task 028d2410-947f-dcd7-b5af-00000000021e 49915 1727204300.57057: WORKER PROCESS EXITING ok: [managed-node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 49915 1727204300.57163: no more pending results, returning what we have 49915 1727204300.57166: results queue empty 49915 1727204300.57166: checking for any_errors_fatal 49915 1727204300.57170: done checking for any_errors_fatal 49915 1727204300.57171: checking for max_fail_percentage 49915 1727204300.57172: done checking for max_fail_percentage 49915 1727204300.57173: checking to see if all hosts have failed and the running result is not ok 49915 1727204300.57174: done checking to see if all hosts have failed 49915 1727204300.57177: getting the remaining hosts for this loop 49915 1727204300.57179: done getting the remaining hosts for this loop 49915 1727204300.57182: getting the next task for host managed-node2 49915 1727204300.57187: done getting next task for host managed-node2 49915 1727204300.57190: ^ task is: TASK: Create veth interface {{ interface }} 49915 1727204300.57192: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204300.57196: getting variables 49915 1727204300.57198: in VariableManager get_vars() 49915 1727204300.57236: Calling all_inventory to load vars for managed-node2 49915 1727204300.57239: Calling groups_inventory to load vars for managed-node2 49915 1727204300.57241: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204300.57250: Calling all_plugins_play to load vars for managed-node2 49915 1727204300.57253: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204300.57255: Calling groups_plugins_play to load vars for managed-node2 49915 1727204300.57434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204300.57552: done with get_vars() 49915 1727204300.57559: done getting variables 49915 1727204300.57601: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204300.57695: variable 'interface' from source: play vars TASK [Create veth interface lsr101] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 14:58:20 -0400 (0:00:00.993) 0:00:07.283 ***** 49915 1727204300.57728: entering _queue_task() for managed-node2/command 49915 1727204300.57930: worker is 1 (out of 1 available) 49915 1727204300.57945: exiting _queue_task() for managed-node2/command 49915 1727204300.57958: done queuing things up, now waiting for results queue to drain 49915 1727204300.57959: waiting for pending results... 
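The "Install iproute" task above came back ok with rc=0, an empty results list and the message "Nothing to do", meaning the package that provides the 'ip' command used by the upcoming task was already installed (the task registers __install_status and retries until it reports success; here a single attempt sufficed). A rough shell equivalent of that check on managed-node2 is sketched below; the dnf and rpm invocations are illustrative assumptions, not the literal calls made by the package module.

    # Assumed rough equivalent of the "Install iproute" task on managed-node2.
    # dnf exits 0 and prints "Nothing to do." when the package is already present,
    # which matches the changed=false / results=[] result shown in the log.
    dnf install -y iproute
    rpm -q iproute   # optional confirmation that the package is installed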
49915 1727204300.58112: running TaskExecutor() for managed-node2/TASK: Create veth interface lsr101 49915 1727204300.58173: in run() - task 028d2410-947f-dcd7-b5af-00000000021f 49915 1727204300.58185: variable 'ansible_search_path' from source: unknown 49915 1727204300.58197: variable 'ansible_search_path' from source: unknown 49915 1727204300.58386: variable 'interface' from source: play vars 49915 1727204300.58448: variable 'interface' from source: play vars 49915 1727204300.58500: variable 'interface' from source: play vars 49915 1727204300.58607: Loaded config def from plugin (lookup/items) 49915 1727204300.58614: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 49915 1727204300.58634: variable 'omit' from source: magic vars 49915 1727204300.58713: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204300.58723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204300.58736: variable 'omit' from source: magic vars 49915 1727204300.58911: variable 'ansible_distribution_major_version' from source: facts 49915 1727204300.58919: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204300.59109: variable 'type' from source: play vars 49915 1727204300.59113: variable 'state' from source: include params 49915 1727204300.59226: variable 'interface' from source: play vars 49915 1727204300.59229: variable 'current_interfaces' from source: set_fact 49915 1727204300.59232: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 49915 1727204300.59236: variable 'omit' from source: magic vars 49915 1727204300.59238: variable 'omit' from source: magic vars 49915 1727204300.59240: variable 'item' from source: unknown 49915 1727204300.59275: variable 'item' from source: unknown 49915 1727204300.59290: variable 'omit' from source: magic vars 49915 1727204300.59323: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204300.59352: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204300.59369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204300.59387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204300.59397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204300.59473: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204300.59481: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204300.59485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204300.59585: Set connection var ansible_connection to ssh 49915 1727204300.59589: Set connection var ansible_shell_type to sh 49915 1727204300.59591: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204300.59594: Set connection var ansible_shell_executable to /bin/sh 49915 1727204300.59596: Set connection var ansible_timeout to 10 49915 1727204300.59599: Set connection var ansible_pipelining to False 49915 1727204300.59600: variable 'ansible_shell_executable' from source: unknown 49915 1727204300.59602: variable 'ansible_connection' from source: unknown 49915 1727204300.59604: 
variable 'ansible_module_compression' from source: unknown 49915 1727204300.59606: variable 'ansible_shell_type' from source: unknown 49915 1727204300.59608: variable 'ansible_shell_executable' from source: unknown 49915 1727204300.59611: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204300.59612: variable 'ansible_pipelining' from source: unknown 49915 1727204300.59614: variable 'ansible_timeout' from source: unknown 49915 1727204300.59616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204300.59803: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204300.59807: variable 'omit' from source: magic vars 49915 1727204300.59809: starting attempt loop 49915 1727204300.59812: running the handler 49915 1727204300.59814: _low_level_execute_command(): starting 49915 1727204300.59816: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204300.60422: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204300.60434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204300.60444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204300.60459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204300.60493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204300.60506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204300.60562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204300.60588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204300.60666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204300.62362: stdout chunk (state=3): >>>/root <<< 49915 1727204300.62497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204300.62515: stderr chunk (state=3): >>><<< 49915 1727204300.62536: stdout chunk (state=3): >>><<< 49915 1727204300.62603: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204300.62617: _low_level_execute_command(): starting 49915 1727204300.62620: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204300.6256144-50605-262377790806991 `" && echo ansible-tmp-1727204300.6256144-50605-262377790806991="` echo /root/.ansible/tmp/ansible-tmp-1727204300.6256144-50605-262377790806991 `" ) && sleep 0' 49915 1727204300.63228: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204300.63231: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204300.63234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204300.63236: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 49915 1727204300.63238: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204300.63240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204300.63294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204300.63371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204300.65300: stdout chunk (state=3): >>>ansible-tmp-1727204300.6256144-50605-262377790806991=/root/.ansible/tmp/ansible-tmp-1727204300.6256144-50605-262377790806991 <<< 49915 1727204300.65413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204300.65438: stderr chunk (state=3): >>><<< 49915 1727204300.65441: stdout chunk (state=3): >>><<< 49915 1727204300.65456: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204300.6256144-50605-262377790806991=/root/.ansible/tmp/ansible-tmp-1727204300.6256144-50605-262377790806991 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204300.65482: variable 'ansible_module_compression' from source: unknown 49915 1727204300.65528: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49915 1727204300.65553: variable 'ansible_facts' from source: unknown 49915 1727204300.65609: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204300.6256144-50605-262377790806991/AnsiballZ_command.py 49915 1727204300.65706: Sending initial data 49915 1727204300.65709: Sent initial data (156 bytes) 49915 1727204300.66187: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204300.66202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204300.66217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204300.66322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204300.66346: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204300.66445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204300.68015: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: 
Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 49915 1727204300.68026: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204300.68083: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49915 1727204300.68152: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpjczvktbr /root/.ansible/tmp/ansible-tmp-1727204300.6256144-50605-262377790806991/AnsiballZ_command.py <<< 49915 1727204300.68155: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204300.6256144-50605-262377790806991/AnsiballZ_command.py" <<< 49915 1727204300.68219: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpjczvktbr" to remote "/root/.ansible/tmp/ansible-tmp-1727204300.6256144-50605-262377790806991/AnsiballZ_command.py" <<< 49915 1727204300.68225: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204300.6256144-50605-262377790806991/AnsiballZ_command.py" <<< 49915 1727204300.68852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204300.68894: stderr chunk (state=3): >>><<< 49915 1727204300.68897: stdout chunk (state=3): >>><<< 49915 1727204300.68939: done transferring module to remote 49915 1727204300.68949: _low_level_execute_command(): starting 49915 1727204300.68952: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204300.6256144-50605-262377790806991/ /root/.ansible/tmp/ansible-tmp-1727204300.6256144-50605-262377790806991/AnsiballZ_command.py && sleep 0' 49915 1727204300.69684: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 49915 1727204300.69714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204300.69790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204300.69793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204300.69891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204300.70333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204300.70398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204300.72368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 49915 1727204300.72371: stdout chunk (state=3): >>><<< 49915 1727204300.72373: stderr chunk (state=3): >>><<< 49915 1727204300.72377: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204300.72380: _low_level_execute_command(): starting 49915 1727204300.72382: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204300.6256144-50605-262377790806991/AnsiballZ_command.py && sleep 0' 49915 1727204300.72991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204300.73029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204300.73052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204300.73062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204300.73181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204300.89179: stdout chunk (state=3): >>> <<< 49915 1727204300.89219: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101"], "start": "2024-09-24 14:58:20.881508", "end": "2024-09-24 14:58:20.888774", "delta": "0:00:00.007266", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr101 type veth peer name peerlsr101", "_uses_shell": false, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49915 1727204300.91683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 49915 1727204300.91687: stdout chunk (state=3): >>><<< 49915 1727204300.91689: stderr chunk (state=3): >>><<< 49915 1727204300.91710: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101"], "start": "2024-09-24 14:58:20.881508", "end": "2024-09-24 14:58:20.888774", "delta": "0:00:00.007266", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr101 type veth peer name peerlsr101", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
49915 1727204300.91758: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr101 type veth peer name peerlsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204300.6256144-50605-262377790806991/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204300.91853: _low_level_execute_command(): starting 49915 1727204300.91857: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204300.6256144-50605-262377790806991/ > /dev/null 2>&1 && sleep 0' 49915 1727204300.92580: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204300.92691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204300.92722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204300.92983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204300.97009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204300.97034: stderr chunk (state=3): >>><<< 49915 1727204300.97037: stdout chunk (state=3): >>><<< 49915 1727204300.97070: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204300.97073: handler run complete 49915 1727204300.97096: Evaluated conditional (False): False 49915 1727204300.97181: attempt loop complete, returning result 49915 1727204300.97184: variable 'item' from source: unknown 49915 1727204300.97231: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link add lsr101 type veth peer name peerlsr101) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101" ], "delta": "0:00:00.007266", "end": "2024-09-24 14:58:20.888774", "item": "ip link add lsr101 type veth peer name peerlsr101", "rc": 0, "start": "2024-09-24 14:58:20.881508" } 49915 1727204300.97634: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204300.97637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204300.97639: variable 'omit' from source: magic vars 49915 1727204300.97753: variable 'ansible_distribution_major_version' from source: facts 49915 1727204300.97756: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204300.97959: variable 'type' from source: play vars 49915 1727204300.98065: variable 'state' from source: include params 49915 1727204300.98070: variable 'interface' from source: play vars 49915 1727204300.98073: variable 'current_interfaces' from source: set_fact 49915 1727204300.98077: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 49915 1727204300.98079: variable 'omit' from source: magic vars 49915 1727204300.98081: variable 'omit' from source: magic vars 49915 1727204300.98083: variable 'item' from source: unknown 49915 1727204300.98178: variable 'item' from source: unknown 49915 1727204300.98182: variable 'omit' from source: magic vars 49915 1727204300.98184: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204300.98186: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204300.98188: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204300.98221: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204300.98228: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204300.98235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204300.98331: Set connection var ansible_connection to ssh 49915 1727204300.98339: Set connection var ansible_shell_type to sh 49915 1727204300.98356: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204300.98371: Set connection var ansible_shell_executable to /bin/sh 49915 1727204300.98427: Set connection var ansible_timeout to 10 49915 1727204300.98430: Set connection var ansible_pipelining to False 49915 1727204300.98445: variable 'ansible_shell_executable' from source: unknown 49915 1727204300.98453: variable 'ansible_connection' from source: 
unknown 49915 1727204300.98499: variable 'ansible_module_compression' from source: unknown 49915 1727204300.98502: variable 'ansible_shell_type' from source: unknown 49915 1727204300.98505: variable 'ansible_shell_executable' from source: unknown 49915 1727204300.98507: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204300.98509: variable 'ansible_pipelining' from source: unknown 49915 1727204300.98511: variable 'ansible_timeout' from source: unknown 49915 1727204300.98518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204300.98672: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204300.98744: variable 'omit' from source: magic vars 49915 1727204300.98747: starting attempt loop 49915 1727204300.98750: running the handler 49915 1727204300.98752: _low_level_execute_command(): starting 49915 1727204300.98754: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204300.99895: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.00021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204301.00041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204301.00197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204301.01859: stdout chunk (state=3): >>>/root <<< 49915 1727204301.01982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204301.01986: stdout chunk (state=3): >>><<< 49915 1727204301.01989: stderr chunk (state=3): >>><<< 49915 1727204301.02127: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204301.02131: _low_level_execute_command(): starting 49915 1727204301.02134: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204301.0201538-50605-168077506261977 `" && echo ansible-tmp-1727204301.0201538-50605-168077506261977="` echo /root/.ansible/tmp/ansible-tmp-1727204301.0201538-50605-168077506261977 `" ) && sleep 0' 49915 1727204301.02989: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204301.03008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204301.03191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.03266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204301.03288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204301.03562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204301.05447: stdout chunk (state=3): >>>ansible-tmp-1727204301.0201538-50605-168077506261977=/root/.ansible/tmp/ansible-tmp-1727204301.0201538-50605-168077506261977 <<< 49915 1727204301.05564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204301.05625: stderr chunk (state=3): >>><<< 49915 1727204301.05628: stdout chunk (state=3): >>><<< 49915 1727204301.05799: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204301.0201538-50605-168077506261977=/root/.ansible/tmp/ansible-tmp-1727204301.0201538-50605-168077506261977 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204301.05802: variable 'ansible_module_compression' from source: unknown 49915 1727204301.05804: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49915 1727204301.05806: variable 'ansible_facts' from source: unknown 49915 1727204301.05833: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204301.0201538-50605-168077506261977/AnsiballZ_command.py 49915 1727204301.06001: Sending initial data 49915 1727204301.06025: Sent initial data (156 bytes) 49915 1727204301.06773: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204301.06826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204301.06856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204301.06927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.07009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204301.07034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204301.07134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204301.08725: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 49915 1727204301.08753: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204301.08820: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49915 1727204301.08897: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp0haf3r6m /root/.ansible/tmp/ansible-tmp-1727204301.0201538-50605-168077506261977/AnsiballZ_command.py <<< 49915 1727204301.08901: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204301.0201538-50605-168077506261977/AnsiballZ_command.py" <<< 49915 1727204301.08974: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp0haf3r6m" to remote "/root/.ansible/tmp/ansible-tmp-1727204301.0201538-50605-168077506261977/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204301.0201538-50605-168077506261977/AnsiballZ_command.py" <<< 49915 1727204301.09981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204301.10089: stderr chunk (state=3): >>><<< 49915 1727204301.10092: stdout chunk (state=3): >>><<< 49915 1727204301.10095: done transferring module to remote 49915 1727204301.10097: _low_level_execute_command(): starting 49915 1727204301.10099: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204301.0201538-50605-168077506261977/ /root/.ansible/tmp/ansible-tmp-1727204301.0201538-50605-168077506261977/AnsiballZ_command.py && sleep 0' 49915 1727204301.11039: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204301.11062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204301.11157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204301.13001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204301.13005: stderr chunk (state=3): >>><<< 49915 1727204301.13008: stdout chunk (state=3): >>><<< 49915 1727204301.13069: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204301.13077: _low_level_execute_command(): starting 49915 1727204301.13080: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204301.0201538-50605-168077506261977/AnsiballZ_command.py && sleep 0' 49915 1727204301.13657: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204301.13677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204301.13691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204301.13761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.13802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204301.13808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204301.13892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204301.29509: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr101", "up"], "start": "2024-09-24 14:58:21.289811", "end": "2024-09-24 14:58:21.293595", "delta": "0:00:00.003784", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49915 1727204301.30991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 49915 1727204301.31027: stderr chunk (state=3): >>><<< 49915 1727204301.31030: stdout chunk (state=3): >>><<< 49915 1727204301.31047: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr101", "up"], "start": "2024-09-24 14:58:21.289811", "end": "2024-09-24 14:58:21.293595", "delta": "0:00:00.003784", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
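The second loop item, 'ip link set peerlsr101 up', follows the same per-item execution pattern that the surrounding records trace for every command: create a timestamped temp directory on the remote host, upload AnsiballZ_command.py over the multiplexed SSH connection, make it executable, run it with the remote Python, and remove the temp directory once the JSON result has been read back. Below is a condensed shell sketch of that sequence under stated assumptions: TMP is a placeholder for the real ansible-tmp-... path, and a stub file stands in for the sftp upload so the sketch runs end to end.

    # Condensed sketch of the per-item remote execution flow recorded above.
    TMP=~/.ansible/tmp/ansible-tmp-example                 # placeholder; real names are timestamped and unique
    ( umask 77 && mkdir -p ~/.ansible/tmp && mkdir "$TMP" )
    # In the real run, AnsiballZ_command.py is uploaded into $TMP via sftp (see the SSH2_FXP_* records);
    # a stub is written here only so the sketch is runnable.
    printf '# AnsiballZ_command.py stub\n' > "$TMP/AnsiballZ_command.py"
    chmod u+x "$TMP" "$TMP/AnsiballZ_command.py"
    /usr/bin/python3.12 "$TMP/AnsiballZ_command.py"        # the real wrapper prints the JSON result, e.g. for 'ip link set peerlsr101 up'
    rm -f -r "$TMP" > /dev/null 2>&1                       # cleanup, as in the final _low_level_execute_command()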
49915 1727204301.31079: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr101 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204301.0201538-50605-168077506261977/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204301.31084: _low_level_execute_command(): starting 49915 1727204301.31089: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204301.0201538-50605-168077506261977/ > /dev/null 2>&1 && sleep 0' 49915 1727204301.31539: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204301.31543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204301.31572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204301.31575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49915 1727204301.31579: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204301.31581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.31638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204301.31642: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204301.31644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204301.31727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204301.33549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204301.33577: stderr chunk (state=3): >>><<< 49915 1727204301.33581: stdout chunk (state=3): >>><<< 49915 1727204301.33593: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204301.33598: handler run complete 49915 1727204301.33615: Evaluated conditional (False): False 49915 1727204301.33621: attempt loop complete, returning result 49915 1727204301.33636: variable 'item' from source: unknown 49915 1727204301.33702: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link set peerlsr101 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerlsr101", "up" ], "delta": "0:00:00.003784", "end": "2024-09-24 14:58:21.293595", "item": "ip link set peerlsr101 up", "rc": 0, "start": "2024-09-24 14:58:21.289811" } 49915 1727204301.33821: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204301.33823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204301.33826: variable 'omit' from source: magic vars 49915 1727204301.33923: variable 'ansible_distribution_major_version' from source: facts 49915 1727204301.33927: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204301.34065: variable 'type' from source: play vars 49915 1727204301.34069: variable 'state' from source: include params 49915 1727204301.34071: variable 'interface' from source: play vars 49915 1727204301.34074: variable 'current_interfaces' from source: set_fact 49915 1727204301.34093: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 49915 1727204301.34095: variable 'omit' from source: magic vars 49915 1727204301.34100: variable 'omit' from source: magic vars 49915 1727204301.34129: variable 'item' from source: unknown 49915 1727204301.34198: variable 'item' from source: unknown 49915 1727204301.34232: variable 'omit' from source: magic vars 49915 1727204301.34257: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204301.34260: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204301.34262: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204301.34272: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204301.34281: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204301.34284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204301.34342: Set connection var ansible_connection to ssh 49915 1727204301.34345: Set connection var ansible_shell_type to sh 49915 1727204301.34349: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204301.34357: Set connection var ansible_shell_executable to /bin/sh 49915 1727204301.34361: Set connection var ansible_timeout to 
10 49915 1727204301.34367: Set connection var ansible_pipelining to False 49915 1727204301.34401: variable 'ansible_shell_executable' from source: unknown 49915 1727204301.34404: variable 'ansible_connection' from source: unknown 49915 1727204301.34410: variable 'ansible_module_compression' from source: unknown 49915 1727204301.34415: variable 'ansible_shell_type' from source: unknown 49915 1727204301.34417: variable 'ansible_shell_executable' from source: unknown 49915 1727204301.34419: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204301.34421: variable 'ansible_pipelining' from source: unknown 49915 1727204301.34423: variable 'ansible_timeout' from source: unknown 49915 1727204301.34424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204301.34514: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204301.34524: variable 'omit' from source: magic vars 49915 1727204301.34527: starting attempt loop 49915 1727204301.34530: running the handler 49915 1727204301.34555: _low_level_execute_command(): starting 49915 1727204301.34558: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204301.35024: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204301.35028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.35030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49915 1727204301.35032: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204301.35034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.35091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204301.35098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204301.35170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204301.36765: stdout chunk (state=3): >>>/root <<< 49915 1727204301.36881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204301.36904: stderr chunk (state=3): >>><<< 49915 1727204301.36907: stdout chunk (state=3): >>><<< 49915 1727204301.36922: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204301.36930: _low_level_execute_command(): starting 49915 1727204301.36934: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204301.3692179-50605-204888198836196 `" && echo ansible-tmp-1727204301.3692179-50605-204888198836196="` echo /root/.ansible/tmp/ansible-tmp-1727204301.3692179-50605-204888198836196 `" ) && sleep 0' 49915 1727204301.37417: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204301.37420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204301.37422: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.37424: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204301.37426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 49915 1727204301.37428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.37477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204301.37481: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204301.37560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204301.39495: stdout chunk (state=3): >>>ansible-tmp-1727204301.3692179-50605-204888198836196=/root/.ansible/tmp/ansible-tmp-1727204301.3692179-50605-204888198836196 <<< 49915 1727204301.39607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204301.39655: stdout chunk (state=3): >>><<< 49915 1727204301.39659: stderr chunk (state=3): >>><<< 49915 1727204301.39907: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204301.3692179-50605-204888198836196=/root/.ansible/tmp/ansible-tmp-1727204301.3692179-50605-204888198836196 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204301.39910: variable 'ansible_module_compression' from source: unknown 49915 1727204301.39912: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49915 1727204301.39914: variable 'ansible_facts' from source: unknown 49915 1727204301.39916: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204301.3692179-50605-204888198836196/AnsiballZ_command.py 49915 1727204301.40170: Sending initial data 49915 1727204301.40272: Sent initial data (156 bytes) 49915 1727204301.41087: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.41094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204301.41097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204301.41105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204301.41329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204301.43112: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204301.43183: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49915 1727204301.43250: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp0olg1red /root/.ansible/tmp/ansible-tmp-1727204301.3692179-50605-204888198836196/AnsiballZ_command.py <<< 49915 1727204301.43253: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204301.3692179-50605-204888198836196/AnsiballZ_command.py" <<< 49915 1727204301.43337: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp0olg1red" to remote "/root/.ansible/tmp/ansible-tmp-1727204301.3692179-50605-204888198836196/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204301.3692179-50605-204888198836196/AnsiballZ_command.py" <<< 49915 1727204301.44495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204301.44498: stderr chunk (state=3): >>><<< 49915 1727204301.44503: stdout chunk (state=3): >>><<< 49915 1727204301.44554: done transferring module to remote 49915 1727204301.44561: _low_level_execute_command(): starting 49915 1727204301.44566: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204301.3692179-50605-204888198836196/ /root/.ansible/tmp/ansible-tmp-1727204301.3692179-50605-204888198836196/AnsiballZ_command.py && sleep 0' 49915 1727204301.45355: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204301.45369: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204301.45685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204301.45750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204301.45914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204301.47853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 
1727204301.47856: stdout chunk (state=3): >>><<< 49915 1727204301.47859: stderr chunk (state=3): >>><<< 49915 1727204301.47936: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204301.47940: _low_level_execute_command(): starting 49915 1727204301.47942: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204301.3692179-50605-204888198836196/AnsiballZ_command.py && sleep 0' 49915 1727204301.49127: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204301.49192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.49260: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.49458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204301.49481: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204301.49506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204301.49657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204301.65383: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr101", "up"], "start": "2024-09-24 14:58:21.648215", "end": "2024-09-24 14:58:21.652055", "delta": "0:00:00.003840", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, 
"strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49915 1727204301.67048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 49915 1727204301.67056: stdout chunk (state=3): >>><<< 49915 1727204301.67059: stderr chunk (state=3): >>><<< 49915 1727204301.67072: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr101", "up"], "start": "2024-09-24 14:58:21.648215", "end": "2024-09-24 14:58:21.652055", "delta": "0:00:00.003840", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
49915 1727204301.67095: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr101 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204301.3692179-50605-204888198836196/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204301.67100: _low_level_execute_command(): starting 49915 1727204301.67106: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204301.3692179-50605-204888198836196/ > /dev/null 2>&1 && sleep 0' 49915 1727204301.67778: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204301.67782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.67785: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204301.67787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204301.67790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.67824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204301.67895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204301.69782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204301.69799: stdout chunk (state=3): >>><<< 49915 1727204301.69801: stderr chunk (state=3): >>><<< 49915 1727204301.69811: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204301.69849: handler run complete 49915 1727204301.69854: Evaluated conditional (False): False 49915 1727204301.69855: attempt loop complete, returning result 49915 1727204301.69891: variable 'item' from source: unknown 49915 1727204301.69993: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link set lsr101 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "lsr101", "up" ], "delta": "0:00:00.003840", "end": "2024-09-24 14:58:21.652055", "item": "ip link set lsr101 up", "rc": 0, "start": "2024-09-24 14:58:21.648215" } 49915 1727204301.70142: dumping result to json 49915 1727204301.70145: done dumping result, returning 49915 1727204301.70147: done running TaskExecutor() for managed-node2/TASK: Create veth interface lsr101 [028d2410-947f-dcd7-b5af-00000000021f] 49915 1727204301.70148: sending task result for task 028d2410-947f-dcd7-b5af-00000000021f 49915 1727204301.70220: done sending task result for task 028d2410-947f-dcd7-b5af-00000000021f 49915 1727204301.70225: WORKER PROCESS EXITING 49915 1727204301.70409: no more pending results, returning what we have 49915 1727204301.70412: results queue empty 49915 1727204301.70413: checking for any_errors_fatal 49915 1727204301.70418: done checking for any_errors_fatal 49915 1727204301.70419: checking for max_fail_percentage 49915 1727204301.70420: done checking for max_fail_percentage 49915 1727204301.70421: checking to see if all hosts have failed and the running result is not ok 49915 1727204301.70422: done checking to see if all hosts have failed 49915 1727204301.70422: getting the remaining hosts for this loop 49915 1727204301.70424: done getting the remaining hosts for this loop 49915 1727204301.70427: getting the next task for host managed-node2 49915 1727204301.70432: done getting next task for host managed-node2 49915 1727204301.70434: ^ task is: TASK: Set up veth as managed by NetworkManager 49915 1727204301.70436: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204301.70443: getting variables 49915 1727204301.70445: in VariableManager get_vars() 49915 1727204301.70502: Calling all_inventory to load vars for managed-node2 49915 1727204301.70505: Calling groups_inventory to load vars for managed-node2 49915 1727204301.70506: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204301.70518: Calling all_plugins_play to load vars for managed-node2 49915 1727204301.70520: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204301.70522: Calling groups_plugins_play to load vars for managed-node2 49915 1727204301.70702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204301.70849: done with get_vars() 49915 1727204301.70857: done getting variables 49915 1727204301.70901: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 14:58:21 -0400 (0:00:01.131) 0:00:08.415 ***** 49915 1727204301.70921: entering _queue_task() for managed-node2/command 49915 1727204301.71178: worker is 1 (out of 1 available) 49915 1727204301.71192: exiting _queue_task() for managed-node2/command 49915 1727204301.71205: done queuing things up, now waiting for results queue to drain 49915 1727204301.71207: waiting for pending results... 
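For reference, the "Set connection var ..." entries show the effective connection settings for managed-node2: the ssh connection plugin, an sh shell at /bin/sh, a 10-second timeout, pipelining off, and ZIP_DEFLATED module compression. Most of these are reported "from source: unknown", so they appear to be defaults rather than inventory settings; a hypothetical host_vars file that pinned the same values explicitly might look like this.

```yaml
# Hypothetical host_vars/managed-node2.yml matching the settings the log reports;
# in this run they appear to come from defaults, not from a vars file like this.
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_module_compression: ZIP_DEFLATED
ansible_timeout: 10
ansible_pipelining: false
```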
49915 1727204301.71393: running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager 49915 1727204301.71489: in run() - task 028d2410-947f-dcd7-b5af-000000000220 49915 1727204301.71498: variable 'ansible_search_path' from source: unknown 49915 1727204301.71502: variable 'ansible_search_path' from source: unknown 49915 1727204301.71632: calling self._execute() 49915 1727204301.71881: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204301.71885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204301.71888: variable 'omit' from source: magic vars 49915 1727204301.72032: variable 'ansible_distribution_major_version' from source: facts 49915 1727204301.72043: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204301.72195: variable 'type' from source: play vars 49915 1727204301.72199: variable 'state' from source: include params 49915 1727204301.72206: Evaluated conditional (type == 'veth' and state == 'present'): True 49915 1727204301.72212: variable 'omit' from source: magic vars 49915 1727204301.72249: variable 'omit' from source: magic vars 49915 1727204301.72348: variable 'interface' from source: play vars 49915 1727204301.72365: variable 'omit' from source: magic vars 49915 1727204301.72408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204301.72443: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204301.72462: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204301.72479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204301.72491: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204301.72523: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204301.72526: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204301.72529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204301.72623: Set connection var ansible_connection to ssh 49915 1727204301.72626: Set connection var ansible_shell_type to sh 49915 1727204301.72632: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204301.72642: Set connection var ansible_shell_executable to /bin/sh 49915 1727204301.72647: Set connection var ansible_timeout to 10 49915 1727204301.72656: Set connection var ansible_pipelining to False 49915 1727204301.72677: variable 'ansible_shell_executable' from source: unknown 49915 1727204301.72680: variable 'ansible_connection' from source: unknown 49915 1727204301.72683: variable 'ansible_module_compression' from source: unknown 49915 1727204301.72685: variable 'ansible_shell_type' from source: unknown 49915 1727204301.72688: variable 'ansible_shell_executable' from source: unknown 49915 1727204301.72690: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204301.72694: variable 'ansible_pipelining' from source: unknown 49915 1727204301.72697: variable 'ansible_timeout' from source: unknown 49915 1727204301.72701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204301.72838: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204301.72848: variable 'omit' from source: magic vars 49915 1727204301.72851: starting attempt loop 49915 1727204301.72854: running the handler 49915 1727204301.72874: _low_level_execute_command(): starting 49915 1727204301.72879: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204301.73460: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204301.73501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204301.73505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204301.73508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.73514: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204301.73516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.73561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204301.73564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204301.73573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204301.73656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204301.75562: stdout chunk (state=3): >>>/root <<< 49915 1727204301.75568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204301.75571: stdout chunk (state=3): >>><<< 49915 1727204301.75573: stderr chunk (state=3): >>><<< 49915 1727204301.75603: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204301.75616: _low_level_execute_command(): starting 49915 1727204301.75619: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204301.7560017-50674-52488163263576 `" && echo ansible-tmp-1727204301.7560017-50674-52488163263576="` echo /root/.ansible/tmp/ansible-tmp-1727204301.7560017-50674-52488163263576 `" ) && sleep 0' 49915 1727204301.76288: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204301.76336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204301.76349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204301.76395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204301.76398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.76401: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49915 1727204301.76458: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204301.76464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 49915 1727204301.76470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.76622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204301.76625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204301.76666: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204301.76804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204301.78733: stdout chunk (state=3): >>>ansible-tmp-1727204301.7560017-50674-52488163263576=/root/.ansible/tmp/ansible-tmp-1727204301.7560017-50674-52488163263576 <<< 49915 1727204301.78907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204301.78911: stdout chunk (state=3): >>><<< 49915 1727204301.78947: stderr chunk (state=3): >>><<< 49915 1727204301.78954: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204301.7560017-50674-52488163263576=/root/.ansible/tmp/ansible-tmp-1727204301.7560017-50674-52488163263576 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204301.78979: variable 'ansible_module_compression' from source: unknown 49915 1727204301.79042: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49915 1727204301.79088: variable 'ansible_facts' from source: unknown 49915 1727204301.79294: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204301.7560017-50674-52488163263576/AnsiballZ_command.py 49915 1727204301.79339: Sending initial data 49915 1727204301.79342: Sent initial data (155 bytes) 49915 1727204301.79995: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204301.80001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204301.80017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.80036: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204301.80039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.80096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204301.80100: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204301.80102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204301.80181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204301.81749: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension 
"copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204301.81821: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49915 1727204301.81901: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpmo0smenz /root/.ansible/tmp/ansible-tmp-1727204301.7560017-50674-52488163263576/AnsiballZ_command.py <<< 49915 1727204301.81904: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204301.7560017-50674-52488163263576/AnsiballZ_command.py" <<< 49915 1727204301.81973: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpmo0smenz" to remote "/root/.ansible/tmp/ansible-tmp-1727204301.7560017-50674-52488163263576/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204301.7560017-50674-52488163263576/AnsiballZ_command.py" <<< 49915 1727204301.82706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204301.82852: stderr chunk (state=3): >>><<< 49915 1727204301.82858: stdout chunk (state=3): >>><<< 49915 1727204301.82860: done transferring module to remote 49915 1727204301.82862: _low_level_execute_command(): starting 49915 1727204301.82864: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204301.7560017-50674-52488163263576/ /root/.ansible/tmp/ansible-tmp-1727204301.7560017-50674-52488163263576/AnsiballZ_command.py && sleep 0' 49915 1727204301.83356: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204301.83363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204301.83367: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.83369: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49915 1727204301.83371: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204301.83373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.83419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204301.83423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204301.83504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204301.85321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204301.85348: stderr chunk (state=3): >>><<< 49915 1727204301.85351: stdout chunk (state=3): >>><<< 49915 1727204301.85364: _low_level_execute_command() 
done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204301.85367: _low_level_execute_command(): starting 49915 1727204301.85372: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204301.7560017-50674-52488163263576/AnsiballZ_command.py && sleep 0' 49915 1727204301.85767: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204301.85801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204301.85804: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.85807: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204301.85809: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204301.85861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204301.85867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204301.85940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204302.03133: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr101", "managed", "true"], "start": "2024-09-24 14:58:22.009766", "end": "2024-09-24 14:58:22.029744", "delta": "0:00:00.019978", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr101 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, 
"creates": null, "removes": null, "stdin": null}}} <<< 49915 1727204302.04889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 49915 1727204302.04892: stdout chunk (state=3): >>><<< 49915 1727204302.04895: stderr chunk (state=3): >>><<< 49915 1727204302.04897: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr101", "managed", "true"], "start": "2024-09-24 14:58:22.009766", "end": "2024-09-24 14:58:22.029744", "delta": "0:00:00.019978", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr101 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
49915 1727204302.04900: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr101 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204301.7560017-50674-52488163263576/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204302.04908: _low_level_execute_command(): starting 49915 1727204302.04910: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204301.7560017-50674-52488163263576/ > /dev/null 2>&1 && sleep 0' 49915 1727204302.05471: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204302.05489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204302.05503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204302.05522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204302.05545: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204302.05645: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204302.05668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204302.05775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204302.07668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204302.07680: stdout chunk (state=3): >>><<< 49915 1727204302.07691: stderr chunk (state=3): >>><<< 49915 1727204302.07715: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204302.07734: handler run complete 49915 1727204302.07760: Evaluated conditional (False): False 49915 1727204302.07774: attempt loop complete, returning result 49915 1727204302.07783: _execute() done 49915 1727204302.07789: dumping result to json 49915 1727204302.07798: done dumping result, returning 49915 1727204302.07813: done running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager [028d2410-947f-dcd7-b5af-000000000220] 49915 1727204302.07822: sending task result for task 028d2410-947f-dcd7-b5af-000000000220 ok: [managed-node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "lsr101", "managed", "true" ], "delta": "0:00:00.019978", "end": "2024-09-24 14:58:22.029744", "rc": 0, "start": "2024-09-24 14:58:22.009766" } 49915 1727204302.08042: no more pending results, returning what we have 49915 1727204302.08045: results queue empty 49915 1727204302.08046: checking for any_errors_fatal 49915 1727204302.08056: done checking for any_errors_fatal 49915 1727204302.08057: checking for max_fail_percentage 49915 1727204302.08059: done checking for max_fail_percentage 49915 1727204302.08060: checking to see if all hosts have failed and the running result is not ok 49915 1727204302.08061: done checking to see if all hosts have failed 49915 1727204302.08062: getting the remaining hosts for this loop 49915 1727204302.08063: done getting the remaining hosts for this loop 49915 1727204302.08068: getting the next task for host managed-node2 49915 1727204302.08074: done getting next task for host managed-node2 49915 1727204302.08191: ^ task is: TASK: Delete veth interface {{ interface }} 49915 1727204302.08195: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204302.08199: getting variables 49915 1727204302.08201: in VariableManager get_vars() 49915 1727204302.08243: Calling all_inventory to load vars for managed-node2 49915 1727204302.08245: Calling groups_inventory to load vars for managed-node2 49915 1727204302.08248: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.08259: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.08261: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.08264: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.08644: done sending task result for task 028d2410-947f-dcd7-b5af-000000000220 49915 1727204302.08647: WORKER PROCESS EXITING 49915 1727204302.08670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.08879: done with get_vars() 49915 1727204302.08890: done getting variables 49915 1727204302.08954: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204302.09065: variable 'interface' from source: play vars TASK [Delete veth interface lsr101] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 14:58:22 -0400 (0:00:00.381) 0:00:08.797 ***** 49915 1727204302.09092: entering _queue_task() for managed-node2/command 49915 1727204302.09595: worker is 1 (out of 1 available) 49915 1727204302.09605: exiting _queue_task() for managed-node2/command 49915 1727204302.09615: done queuing things up, now waiting for results queue to drain 49915 1727204302.09617: waiting for pending results... 
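The result above shows the "Set up veth as managed by NetworkManager" step completing with rc=0: the remote command was nmcli d set lsr101 managed true and it finished in roughly 20 ms (delta 0:00:00.019978). The reported "changed": false on a command task suggests the task probably sets changed_when: false, though that is an inference, not something this log states. A minimal sketch of what such a task could look like in manage_test_interface.yml; only the task name and the command line are taken from the log, the surrounding YAML structure is assumed:

- name: Set up veth as managed by NetworkManager
  ansible.builtin.command: nmcli d set {{ interface }} managed true
  changed_when: false   # assumed, based on the "changed": false result logged above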
49915 1727204302.09674: running TaskExecutor() for managed-node2/TASK: Delete veth interface lsr101 49915 1727204302.09796: in run() - task 028d2410-947f-dcd7-b5af-000000000221 49915 1727204302.09817: variable 'ansible_search_path' from source: unknown 49915 1727204302.09826: variable 'ansible_search_path' from source: unknown 49915 1727204302.09878: calling self._execute() 49915 1727204302.09971: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.09986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.10000: variable 'omit' from source: magic vars 49915 1727204302.10366: variable 'ansible_distribution_major_version' from source: facts 49915 1727204302.10392: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204302.10601: variable 'type' from source: play vars 49915 1727204302.10612: variable 'state' from source: include params 49915 1727204302.10620: variable 'interface' from source: play vars 49915 1727204302.10626: variable 'current_interfaces' from source: set_fact 49915 1727204302.10636: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 49915 1727204302.10642: when evaluation is False, skipping this task 49915 1727204302.10647: _execute() done 49915 1727204302.10651: dumping result to json 49915 1727204302.10657: done dumping result, returning 49915 1727204302.10664: done running TaskExecutor() for managed-node2/TASK: Delete veth interface lsr101 [028d2410-947f-dcd7-b5af-000000000221] 49915 1727204302.10672: sending task result for task 028d2410-947f-dcd7-b5af-000000000221 skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 49915 1727204302.10866: no more pending results, returning what we have 49915 1727204302.10870: results queue empty 49915 1727204302.10871: checking for any_errors_fatal 49915 1727204302.10883: done checking for any_errors_fatal 49915 1727204302.10884: checking for max_fail_percentage 49915 1727204302.10886: done checking for max_fail_percentage 49915 1727204302.10887: checking to see if all hosts have failed and the running result is not ok 49915 1727204302.10888: done checking to see if all hosts have failed 49915 1727204302.10889: getting the remaining hosts for this loop 49915 1727204302.10890: done getting the remaining hosts for this loop 49915 1727204302.10896: getting the next task for host managed-node2 49915 1727204302.10902: done getting next task for host managed-node2 49915 1727204302.10905: ^ task is: TASK: Create dummy interface {{ interface }} 49915 1727204302.10908: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204302.10913: getting variables 49915 1727204302.10915: in VariableManager get_vars() 49915 1727204302.10962: Calling all_inventory to load vars for managed-node2 49915 1727204302.10966: Calling groups_inventory to load vars for managed-node2 49915 1727204302.10968: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.11039: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.11043: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.11048: done sending task result for task 028d2410-947f-dcd7-b5af-000000000221 49915 1727204302.11051: WORKER PROCESS EXITING 49915 1727204302.11054: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.11343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.11631: done with get_vars() 49915 1727204302.11640: done getting variables 49915 1727204302.11709: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204302.11825: variable 'interface' from source: play vars TASK [Create dummy interface lsr101] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 14:58:22 -0400 (0:00:00.027) 0:00:08.824 ***** 49915 1727204302.11852: entering _queue_task() for managed-node2/command 49915 1727204302.12286: worker is 1 (out of 1 available) 49915 1727204302.12298: exiting _queue_task() for managed-node2/command 49915 1727204302.12309: done queuing things up, now waiting for results queue to drain 49915 1727204302.12311: waiting for pending results... 
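The "Delete veth interface lsr101" task is skipped: the logged false_condition shows its when clause is type == 'veth' and state == 'absent' and interface in current_interfaces, which is False for this run because the play is setting the interface up rather than tearing it down. A sketch of the shape of such a guarded task; the when expression is copied from the false_condition above, while the command body is an assumption (the task was skipped, so its real command never appears in this log):

- name: Delete veth interface {{ interface }}
  ansible.builtin.command: ip link delete {{ interface }}   # assumed body; not shown in the log
  when: type == 'veth' and state == 'absent' and interface in current_interfaces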
49915 1727204302.12428: running TaskExecutor() for managed-node2/TASK: Create dummy interface lsr101 49915 1727204302.12533: in run() - task 028d2410-947f-dcd7-b5af-000000000222 49915 1727204302.12556: variable 'ansible_search_path' from source: unknown 49915 1727204302.12563: variable 'ansible_search_path' from source: unknown 49915 1727204302.12607: calling self._execute() 49915 1727204302.12706: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.12718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.12755: variable 'omit' from source: magic vars 49915 1727204302.13102: variable 'ansible_distribution_major_version' from source: facts 49915 1727204302.13120: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204302.13380: variable 'type' from source: play vars 49915 1727204302.13383: variable 'state' from source: include params 49915 1727204302.13386: variable 'interface' from source: play vars 49915 1727204302.13388: variable 'current_interfaces' from source: set_fact 49915 1727204302.13390: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 49915 1727204302.13392: when evaluation is False, skipping this task 49915 1727204302.13394: _execute() done 49915 1727204302.13396: dumping result to json 49915 1727204302.13398: done dumping result, returning 49915 1727204302.13400: done running TaskExecutor() for managed-node2/TASK: Create dummy interface lsr101 [028d2410-947f-dcd7-b5af-000000000222] 49915 1727204302.13410: sending task result for task 028d2410-947f-dcd7-b5af-000000000222 49915 1727204302.13469: done sending task result for task 028d2410-947f-dcd7-b5af-000000000222 49915 1727204302.13472: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 49915 1727204302.13559: no more pending results, returning what we have 49915 1727204302.13563: results queue empty 49915 1727204302.13564: checking for any_errors_fatal 49915 1727204302.13570: done checking for any_errors_fatal 49915 1727204302.13570: checking for max_fail_percentage 49915 1727204302.13572: done checking for max_fail_percentage 49915 1727204302.13573: checking to see if all hosts have failed and the running result is not ok 49915 1727204302.13574: done checking to see if all hosts have failed 49915 1727204302.13574: getting the remaining hosts for this loop 49915 1727204302.13578: done getting the remaining hosts for this loop 49915 1727204302.13582: getting the next task for host managed-node2 49915 1727204302.13588: done getting next task for host managed-node2 49915 1727204302.13590: ^ task is: TASK: Delete dummy interface {{ interface }} 49915 1727204302.13594: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204302.13598: getting variables 49915 1727204302.13599: in VariableManager get_vars() 49915 1727204302.13641: Calling all_inventory to load vars for managed-node2 49915 1727204302.13644: Calling groups_inventory to load vars for managed-node2 49915 1727204302.13646: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.13659: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.13661: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.13664: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.14036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.14234: done with get_vars() 49915 1727204302.14244: done getting variables 49915 1727204302.14300: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204302.14408: variable 'interface' from source: play vars TASK [Delete dummy interface lsr101] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 14:58:22 -0400 (0:00:00.025) 0:00:08.850 ***** 49915 1727204302.14435: entering _queue_task() for managed-node2/command 49915 1727204302.14796: worker is 1 (out of 1 available) 49915 1727204302.14806: exiting _queue_task() for managed-node2/command 49915 1727204302.14816: done queuing things up, now waiting for results queue to drain 49915 1727204302.14817: waiting for pending results... 
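For each of these guarded tasks the variable sources are logged explicitly: type and interface come from play vars, state comes from include params, and current_interfaces comes from an earlier set_fact. That pattern implies manage_test_interface.yml is driven by an include that passes state in. A hedged sketch of such an invocation; the file path matches the task path lines above, while the values shown are only illustrative of this run (interface lsr101, type veth):

- name: Manage the test interface
  ansible.builtin.include_tasks: tasks/manage_test_interface.yml
  vars:
    state: present   # logged as coming from "include params"
# interface (lsr101) and type (veth) are logged as play vars in this run, so they
# are assumed to be defined at play level rather than passed on the include.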
49915 1727204302.15000: running TaskExecutor() for managed-node2/TASK: Delete dummy interface lsr101 49915 1727204302.15099: in run() - task 028d2410-947f-dcd7-b5af-000000000223 49915 1727204302.15103: variable 'ansible_search_path' from source: unknown 49915 1727204302.15106: variable 'ansible_search_path' from source: unknown 49915 1727204302.15125: calling self._execute() 49915 1727204302.15213: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.15259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.15263: variable 'omit' from source: magic vars 49915 1727204302.15598: variable 'ansible_distribution_major_version' from source: facts 49915 1727204302.15616: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204302.15827: variable 'type' from source: play vars 49915 1727204302.15838: variable 'state' from source: include params 49915 1727204302.15858: variable 'interface' from source: play vars 49915 1727204302.15862: variable 'current_interfaces' from source: set_fact 49915 1727204302.15881: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 49915 1727204302.15883: when evaluation is False, skipping this task 49915 1727204302.15885: _execute() done 49915 1727204302.15908: dumping result to json 49915 1727204302.15911: done dumping result, returning 49915 1727204302.15913: done running TaskExecutor() for managed-node2/TASK: Delete dummy interface lsr101 [028d2410-947f-dcd7-b5af-000000000223] 49915 1727204302.15921: sending task result for task 028d2410-947f-dcd7-b5af-000000000223 49915 1727204302.16123: done sending task result for task 028d2410-947f-dcd7-b5af-000000000223 49915 1727204302.16126: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 49915 1727204302.16173: no more pending results, returning what we have 49915 1727204302.16181: results queue empty 49915 1727204302.16183: checking for any_errors_fatal 49915 1727204302.16188: done checking for any_errors_fatal 49915 1727204302.16189: checking for max_fail_percentage 49915 1727204302.16191: done checking for max_fail_percentage 49915 1727204302.16192: checking to see if all hosts have failed and the running result is not ok 49915 1727204302.16193: done checking to see if all hosts have failed 49915 1727204302.16193: getting the remaining hosts for this loop 49915 1727204302.16195: done getting the remaining hosts for this loop 49915 1727204302.16199: getting the next task for host managed-node2 49915 1727204302.16206: done getting next task for host managed-node2 49915 1727204302.16208: ^ task is: TASK: Create tap interface {{ interface }} 49915 1727204302.16212: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204302.16215: getting variables 49915 1727204302.16217: in VariableManager get_vars() 49915 1727204302.16261: Calling all_inventory to load vars for managed-node2 49915 1727204302.16264: Calling groups_inventory to load vars for managed-node2 49915 1727204302.16266: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.16417: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.16421: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.16425: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.16653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.16867: done with get_vars() 49915 1727204302.16880: done getting variables 49915 1727204302.16940: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204302.17062: variable 'interface' from source: play vars TASK [Create tap interface lsr101] ********************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 14:58:22 -0400 (0:00:00.026) 0:00:08.877 ***** 49915 1727204302.17101: entering _queue_task() for managed-node2/command 49915 1727204302.17371: worker is 1 (out of 1 available) 49915 1727204302.17385: exiting _queue_task() for managed-node2/command 49915 1727204302.17483: done queuing things up, now waiting for results queue to drain 49915 1727204302.17485: waiting for pending results... 
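Every one of these conditions also checks membership in current_interfaces, which the log attributes to set_fact. How that fact is built is not visible in this section; below is a purely illustrative sketch of one way such a fact could be gathered (the task names, the ls command, and the net_ls register name are all hypothetical):

- name: List existing network interfaces   # hypothetical helper task
  ansible.builtin.command: ls /sys/class/net
  register: net_ls
  changed_when: false

- name: Record them for the conditionals   # hypothetical helper task
  ansible.builtin.set_fact:
    current_interfaces: "{{ net_ls.stdout_lines }}"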
49915 1727204302.17735: running TaskExecutor() for managed-node2/TASK: Create tap interface lsr101 49915 1727204302.17755: in run() - task 028d2410-947f-dcd7-b5af-000000000224 49915 1727204302.17772: variable 'ansible_search_path' from source: unknown 49915 1727204302.17781: variable 'ansible_search_path' from source: unknown 49915 1727204302.17819: calling self._execute() 49915 1727204302.17912: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.17923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.17941: variable 'omit' from source: magic vars 49915 1727204302.18294: variable 'ansible_distribution_major_version' from source: facts 49915 1727204302.18311: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204302.18580: variable 'type' from source: play vars 49915 1727204302.18583: variable 'state' from source: include params 49915 1727204302.18587: variable 'interface' from source: play vars 49915 1727204302.18589: variable 'current_interfaces' from source: set_fact 49915 1727204302.18592: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 49915 1727204302.18594: when evaluation is False, skipping this task 49915 1727204302.18596: _execute() done 49915 1727204302.18598: dumping result to json 49915 1727204302.18601: done dumping result, returning 49915 1727204302.18603: done running TaskExecutor() for managed-node2/TASK: Create tap interface lsr101 [028d2410-947f-dcd7-b5af-000000000224] 49915 1727204302.18605: sending task result for task 028d2410-947f-dcd7-b5af-000000000224 49915 1727204302.18881: done sending task result for task 028d2410-947f-dcd7-b5af-000000000224 49915 1727204302.18885: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 49915 1727204302.18922: no more pending results, returning what we have 49915 1727204302.18925: results queue empty 49915 1727204302.18926: checking for any_errors_fatal 49915 1727204302.18930: done checking for any_errors_fatal 49915 1727204302.18930: checking for max_fail_percentage 49915 1727204302.18932: done checking for max_fail_percentage 49915 1727204302.18933: checking to see if all hosts have failed and the running result is not ok 49915 1727204302.18934: done checking to see if all hosts have failed 49915 1727204302.18935: getting the remaining hosts for this loop 49915 1727204302.18936: done getting the remaining hosts for this loop 49915 1727204302.18939: getting the next task for host managed-node2 49915 1727204302.18944: done getting next task for host managed-node2 49915 1727204302.18946: ^ task is: TASK: Delete tap interface {{ interface }} 49915 1727204302.18949: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204302.18952: getting variables 49915 1727204302.18953: in VariableManager get_vars() 49915 1727204302.18989: Calling all_inventory to load vars for managed-node2 49915 1727204302.18992: Calling groups_inventory to load vars for managed-node2 49915 1727204302.19020: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.19030: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.19032: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.19035: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.19209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.19407: done with get_vars() 49915 1727204302.19416: done getting variables 49915 1727204302.19478: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204302.19590: variable 'interface' from source: play vars TASK [Delete tap interface lsr101] ********************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 14:58:22 -0400 (0:00:00.025) 0:00:08.902 ***** 49915 1727204302.19617: entering _queue_task() for managed-node2/command 49915 1727204302.20081: worker is 1 (out of 1 available) 49915 1727204302.20090: exiting _queue_task() for managed-node2/command 49915 1727204302.20099: done queuing things up, now waiting for results queue to drain 49915 1727204302.20100: waiting for pending results... 
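The dummy and tap branches are evaluated the same way and all skip here because type is veth. Their when expressions can be read directly from the false_condition fields (the tap-delete one appears in the next entry); the command bodies below are assumptions in the same spirit as the veth sketch above, not taken from this log:

- name: Create dummy interface {{ interface }}
  ansible.builtin.command: ip link add {{ interface }} type dummy       # assumed body
  when: type == 'dummy' and state == 'present' and interface not in current_interfaces

- name: Delete dummy interface {{ interface }}
  ansible.builtin.command: ip link delete {{ interface }}               # assumed body
  when: type == 'dummy' and state == 'absent' and interface in current_interfaces

- name: Create tap interface {{ interface }}
  ansible.builtin.command: ip tuntap add dev {{ interface }} mode tap   # assumed body
  when: type == 'tap' and state == 'present' and interface not in current_interfaces

- name: Delete tap interface {{ interface }}
  ansible.builtin.command: ip tuntap del dev {{ interface }} mode tap   # assumed body
  when: type == 'tap' and state == 'absent' and interface in current_interfaces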
49915 1727204302.20145: running TaskExecutor() for managed-node2/TASK: Delete tap interface lsr101 49915 1727204302.20241: in run() - task 028d2410-947f-dcd7-b5af-000000000225 49915 1727204302.20259: variable 'ansible_search_path' from source: unknown 49915 1727204302.20267: variable 'ansible_search_path' from source: unknown 49915 1727204302.20308: calling self._execute() 49915 1727204302.20399: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.20411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.20436: variable 'omit' from source: magic vars 49915 1727204302.21187: variable 'ansible_distribution_major_version' from source: facts 49915 1727204302.21191: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204302.21513: variable 'type' from source: play vars 49915 1727204302.21631: variable 'state' from source: include params 49915 1727204302.21641: variable 'interface' from source: play vars 49915 1727204302.21650: variable 'current_interfaces' from source: set_fact 49915 1727204302.21665: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 49915 1727204302.21672: when evaluation is False, skipping this task 49915 1727204302.21737: _execute() done 49915 1727204302.21746: dumping result to json 49915 1727204302.21754: done dumping result, returning 49915 1727204302.21765: done running TaskExecutor() for managed-node2/TASK: Delete tap interface lsr101 [028d2410-947f-dcd7-b5af-000000000225] 49915 1727204302.21781: sending task result for task 028d2410-947f-dcd7-b5af-000000000225 skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 49915 1727204302.21998: no more pending results, returning what we have 49915 1727204302.22002: results queue empty 49915 1727204302.22003: checking for any_errors_fatal 49915 1727204302.22012: done checking for any_errors_fatal 49915 1727204302.22013: checking for max_fail_percentage 49915 1727204302.22015: done checking for max_fail_percentage 49915 1727204302.22016: checking to see if all hosts have failed and the running result is not ok 49915 1727204302.22018: done checking to see if all hosts have failed 49915 1727204302.22018: getting the remaining hosts for this loop 49915 1727204302.22020: done getting the remaining hosts for this loop 49915 1727204302.22024: getting the next task for host managed-node2 49915 1727204302.22033: done getting next task for host managed-node2 49915 1727204302.22037: ^ task is: TASK: Include the task 'assert_device_present.yml' 49915 1727204302.22040: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204302.22044: getting variables 49915 1727204302.22046: in VariableManager get_vars() 49915 1727204302.22096: Calling all_inventory to load vars for managed-node2 49915 1727204302.22099: Calling groups_inventory to load vars for managed-node2 49915 1727204302.22102: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.22115: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.22118: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.22122: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.22866: done sending task result for task 028d2410-947f-dcd7-b5af-000000000225 49915 1727204302.22870: WORKER PROCESS EXITING 49915 1727204302.22886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.23361: done with get_vars() 49915 1727204302.23372: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:16 Tuesday 24 September 2024 14:58:22 -0400 (0:00:00.039) 0:00:08.941 ***** 49915 1727204302.23557: entering _queue_task() for managed-node2/include_tasks 49915 1727204302.24223: worker is 1 (out of 1 available) 49915 1727204302.24238: exiting _queue_task() for managed-node2/include_tasks 49915 1727204302.24249: done queuing things up, now waiting for results queue to drain 49915 1727204302.24251: waiting for pending results... 49915 1727204302.24663: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_present.yml' 49915 1727204302.24765: in run() - task 028d2410-947f-dcd7-b5af-00000000000d 49915 1727204302.24899: variable 'ansible_search_path' from source: unknown 49915 1727204302.24946: calling self._execute() 49915 1727204302.25183: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.25187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.25200: variable 'omit' from source: magic vars 49915 1727204302.25982: variable 'ansible_distribution_major_version' from source: facts 49915 1727204302.26004: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204302.26020: _execute() done 49915 1727204302.26030: dumping result to json 49915 1727204302.26040: done dumping result, returning 49915 1727204302.26052: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_present.yml' [028d2410-947f-dcd7-b5af-00000000000d] 49915 1727204302.26381: sending task result for task 028d2410-947f-dcd7-b5af-00000000000d 49915 1727204302.26461: done sending task result for task 028d2410-947f-dcd7-b5af-00000000000d 49915 1727204302.26465: WORKER PROCESS EXITING 49915 1727204302.26495: no more pending results, returning what we have 49915 1727204302.26499: in VariableManager get_vars() 49915 1727204302.26549: Calling all_inventory to load vars for managed-node2 49915 1727204302.26552: Calling groups_inventory to load vars for managed-node2 49915 1727204302.26555: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.26568: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.26570: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.26573: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.26958: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.27208: done with get_vars() 49915 1727204302.27331: variable 'ansible_search_path' from source: unknown 49915 1727204302.27345: we have included files to process 49915 1727204302.27346: generating all_blocks data 49915 1727204302.27347: done generating all_blocks data 49915 1727204302.27352: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 49915 1727204302.27353: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 49915 1727204302.27356: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 49915 1727204302.27604: in VariableManager get_vars() 49915 1727204302.27627: done with get_vars() 49915 1727204302.27768: done processing included file 49915 1727204302.27770: iterating over new_blocks loaded from include file 49915 1727204302.27771: in VariableManager get_vars() 49915 1727204302.27790: done with get_vars() 49915 1727204302.27792: filtering new block on tags 49915 1727204302.27809: done filtering new block on tags 49915 1727204302.27811: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2 49915 1727204302.27817: extending task lists for all hosts with included blocks 49915 1727204302.30134: done extending task lists 49915 1727204302.30136: done processing included files 49915 1727204302.30137: results queue empty 49915 1727204302.30138: checking for any_errors_fatal 49915 1727204302.30140: done checking for any_errors_fatal 49915 1727204302.30145: checking for max_fail_percentage 49915 1727204302.30147: done checking for max_fail_percentage 49915 1727204302.30148: checking to see if all hosts have failed and the running result is not ok 49915 1727204302.30148: done checking to see if all hosts have failed 49915 1727204302.30149: getting the remaining hosts for this loop 49915 1727204302.30150: done getting the remaining hosts for this loop 49915 1727204302.30153: getting the next task for host managed-node2 49915 1727204302.30157: done getting next task for host managed-node2 49915 1727204302.30159: ^ task is: TASK: Include the task 'get_interface_stat.yml' 49915 1727204302.30161: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204302.30164: getting variables 49915 1727204302.30165: in VariableManager get_vars() 49915 1727204302.30182: Calling all_inventory to load vars for managed-node2 49915 1727204302.30185: Calling groups_inventory to load vars for managed-node2 49915 1727204302.30187: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.30193: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.30196: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.30199: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.30580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.30772: done with get_vars() 49915 1727204302.30784: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:58:22 -0400 (0:00:00.073) 0:00:09.014 ***** 49915 1727204302.30858: entering _queue_task() for managed-node2/include_tasks 49915 1727204302.31178: worker is 1 (out of 1 available) 49915 1727204302.31190: exiting _queue_task() for managed-node2/include_tasks 49915 1727204302.31202: done queuing things up, now waiting for results queue to drain 49915 1727204302.31203: waiting for pending results... 49915 1727204302.31464: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 49915 1727204302.31556: in run() - task 028d2410-947f-dcd7-b5af-00000000038b 49915 1727204302.31573: variable 'ansible_search_path' from source: unknown 49915 1727204302.31583: variable 'ansible_search_path' from source: unknown 49915 1727204302.31628: calling self._execute() 49915 1727204302.31715: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.31727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.31739: variable 'omit' from source: magic vars 49915 1727204302.32112: variable 'ansible_distribution_major_version' from source: facts 49915 1727204302.32129: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204302.32145: _execute() done 49915 1727204302.32153: dumping result to json 49915 1727204302.32160: done dumping result, returning 49915 1727204302.32169: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-dcd7-b5af-00000000038b] 49915 1727204302.32182: sending task result for task 028d2410-947f-dcd7-b5af-00000000038b 49915 1727204302.32306: no more pending results, returning what we have 49915 1727204302.32312: in VariableManager get_vars() 49915 1727204302.32358: Calling all_inventory to load vars for managed-node2 49915 1727204302.32361: Calling groups_inventory to load vars for managed-node2 49915 1727204302.32363: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.32379: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.32383: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.32386: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.32689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.33081: done with get_vars() 49915 1727204302.33089: variable 'ansible_search_path' from source: unknown 
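At this point the playbook switches from setup to verification: tests_vlan_mtu.yml:16 includes assert_device_present.yml, and line 3 of that file in turn includes get_interface_stat.yml, both scoped to managed-node2. A sketch of how that two-level include chain typically fits together; the file names and the include at line 3 come from the log, while the assert step and the interface_stat variable name are assumptions (the assertion itself has not run yet in this section):

# assert_device_present.yml (sketch)
- name: Include the task 'get_interface_stat.yml'
  ansible.builtin.include_tasks: get_interface_stat.yml

- name: Assert that the interface is present   # assumed follow-up step
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists             # interface_stat is a guessed register name
    fail_msg: "{{ interface }} is missing from /sys/class/net"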
49915 1727204302.33090: variable 'ansible_search_path' from source: unknown 49915 1727204302.33107: done sending task result for task 028d2410-947f-dcd7-b5af-00000000038b 49915 1727204302.33110: WORKER PROCESS EXITING 49915 1727204302.33137: we have included files to process 49915 1727204302.33138: generating all_blocks data 49915 1727204302.33140: done generating all_blocks data 49915 1727204302.33141: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 49915 1727204302.33142: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 49915 1727204302.33144: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 49915 1727204302.33370: done processing included file 49915 1727204302.33372: iterating over new_blocks loaded from include file 49915 1727204302.33373: in VariableManager get_vars() 49915 1727204302.33394: done with get_vars() 49915 1727204302.33396: filtering new block on tags 49915 1727204302.33410: done filtering new block on tags 49915 1727204302.33412: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 49915 1727204302.33418: extending task lists for all hosts with included blocks 49915 1727204302.33523: done extending task lists 49915 1727204302.33524: done processing included files 49915 1727204302.33525: results queue empty 49915 1727204302.33526: checking for any_errors_fatal 49915 1727204302.33528: done checking for any_errors_fatal 49915 1727204302.33529: checking for max_fail_percentage 49915 1727204302.33534: done checking for max_fail_percentage 49915 1727204302.33535: checking to see if all hosts have failed and the running result is not ok 49915 1727204302.33536: done checking to see if all hosts have failed 49915 1727204302.33537: getting the remaining hosts for this loop 49915 1727204302.33538: done getting the remaining hosts for this loop 49915 1727204302.33540: getting the next task for host managed-node2 49915 1727204302.33544: done getting next task for host managed-node2 49915 1727204302.33546: ^ task is: TASK: Get stat for interface {{ interface }} 49915 1727204302.33549: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204302.33551: getting variables 49915 1727204302.33552: in VariableManager get_vars() 49915 1727204302.33564: Calling all_inventory to load vars for managed-node2 49915 1727204302.33566: Calling groups_inventory to load vars for managed-node2 49915 1727204302.33568: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.33572: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.33574: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.33579: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.33719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.33918: done with get_vars() 49915 1727204302.33926: done getting variables 49915 1727204302.34083: variable 'interface' from source: play vars TASK [Get stat for interface lsr101] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:58:22 -0400 (0:00:00.032) 0:00:09.047 ***** 49915 1727204302.34111: entering _queue_task() for managed-node2/stat 49915 1727204302.34361: worker is 1 (out of 1 available) 49915 1727204302.34373: exiting _queue_task() for managed-node2/stat 49915 1727204302.34387: done queuing things up, now waiting for results queue to drain 49915 1727204302.34388: waiting for pending results... 49915 1727204302.34656: running TaskExecutor() for managed-node2/TASK: Get stat for interface lsr101 49915 1727204302.34765: in run() - task 028d2410-947f-dcd7-b5af-0000000004a4 49915 1727204302.34786: variable 'ansible_search_path' from source: unknown 49915 1727204302.34793: variable 'ansible_search_path' from source: unknown 49915 1727204302.34831: calling self._execute() 49915 1727204302.34920: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.34932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.34952: variable 'omit' from source: magic vars 49915 1727204302.35388: variable 'ansible_distribution_major_version' from source: facts 49915 1727204302.35407: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204302.35418: variable 'omit' from source: magic vars 49915 1727204302.35463: variable 'omit' from source: magic vars 49915 1727204302.35599: variable 'interface' from source: play vars 49915 1727204302.35603: variable 'omit' from source: magic vars 49915 1727204302.35640: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204302.35678: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204302.35704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204302.35732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204302.35781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204302.35784: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204302.35793: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.35800: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 49915 1727204302.35908: Set connection var ansible_connection to ssh 49915 1727204302.35916: Set connection var ansible_shell_type to sh 49915 1727204302.35934: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204302.35979: Set connection var ansible_shell_executable to /bin/sh 49915 1727204302.35982: Set connection var ansible_timeout to 10 49915 1727204302.35984: Set connection var ansible_pipelining to False 49915 1727204302.35990: variable 'ansible_shell_executable' from source: unknown 49915 1727204302.35997: variable 'ansible_connection' from source: unknown 49915 1727204302.36003: variable 'ansible_module_compression' from source: unknown 49915 1727204302.36009: variable 'ansible_shell_type' from source: unknown 49915 1727204302.36015: variable 'ansible_shell_executable' from source: unknown 49915 1727204302.36021: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.36035: variable 'ansible_pipelining' from source: unknown 49915 1727204302.36142: variable 'ansible_timeout' from source: unknown 49915 1727204302.36150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.36253: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49915 1727204302.36282: variable 'omit' from source: magic vars 49915 1727204302.36289: starting attempt loop 49915 1727204302.36297: running the handler 49915 1727204302.36369: _low_level_execute_command(): starting 49915 1727204302.36372: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204302.37145: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204302.37187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204302.37206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204302.37231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204302.37348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204302.39058: stdout chunk (state=3): >>>/root <<< 49915 1727204302.39156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204302.39220: stderr chunk (state=3): >>><<< 49915 1727204302.39223: stdout chunk (state=3): >>><<< 49915 1727204302.39333: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204302.39338: _low_level_execute_command(): starting 49915 1727204302.39342: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204302.392453-50704-211518841445072 `" && echo ansible-tmp-1727204302.392453-50704-211518841445072="` echo /root/.ansible/tmp/ansible-tmp-1727204302.392453-50704-211518841445072 `" ) && sleep 0' 49915 1727204302.39863: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204302.39886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204302.39904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204302.39922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204302.39941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204302.39953: stderr chunk (state=3): >>>debug2: match not found <<< 49915 1727204302.39979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204302.40000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49915 1727204302.40087: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204302.40108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204302.40124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204302.40232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204302.42428: stdout chunk (state=3): 
>>>ansible-tmp-1727204302.392453-50704-211518841445072=/root/.ansible/tmp/ansible-tmp-1727204302.392453-50704-211518841445072 <<< 49915 1727204302.42432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204302.42450: stderr chunk (state=3): >>><<< 49915 1727204302.42458: stdout chunk (state=3): >>><<< 49915 1727204302.42483: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204302.392453-50704-211518841445072=/root/.ansible/tmp/ansible-tmp-1727204302.392453-50704-211518841445072 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204302.42566: variable 'ansible_module_compression' from source: unknown 49915 1727204302.42634: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 49915 1727204302.42674: variable 'ansible_facts' from source: unknown 49915 1727204302.42778: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204302.392453-50704-211518841445072/AnsiballZ_stat.py 49915 1727204302.43009: Sending initial data 49915 1727204302.43015: Sent initial data (152 bytes) 49915 1727204302.43529: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204302.43541: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204302.43556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204302.43667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49915 1727204302.43682: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204302.43695: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204302.43710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204302.43819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204302.45448: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204302.45558: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49915 1727204302.45633: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpn_u559pb /root/.ansible/tmp/ansible-tmp-1727204302.392453-50704-211518841445072/AnsiballZ_stat.py <<< 49915 1727204302.45644: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204302.392453-50704-211518841445072/AnsiballZ_stat.py" <<< 49915 1727204302.45712: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpn_u559pb" to remote "/root/.ansible/tmp/ansible-tmp-1727204302.392453-50704-211518841445072/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204302.392453-50704-211518841445072/AnsiballZ_stat.py" <<< 49915 1727204302.46646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204302.46816: stderr chunk (state=3): >>><<< 49915 1727204302.46819: stdout chunk (state=3): >>><<< 49915 1727204302.46821: done transferring module to remote 49915 1727204302.46823: _low_level_execute_command(): starting 49915 1727204302.46825: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204302.392453-50704-211518841445072/ /root/.ansible/tmp/ansible-tmp-1727204302.392453-50704-211518841445072/AnsiballZ_stat.py && sleep 0' 49915 1727204302.47649: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204302.47654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204302.47666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204302.47758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204302.47817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204302.49688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204302.49722: stderr chunk (state=3): >>><<< 49915 1727204302.49742: stdout chunk (state=3): >>><<< 49915 1727204302.49826: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204302.49830: _low_level_execute_command(): starting 49915 1727204302.49832: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204302.392453-50704-211518841445072/AnsiballZ_stat.py && sleep 0' 49915 1727204302.50541: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204302.50565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204302.50620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204302.50639: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204302.50672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204302.50803: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204302.66154: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30380, "dev": 23, "nlink": 1, "atime": 1727204300.8850892, "mtime": 1727204300.8850892, "ctime": 1727204300.8850892, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101", "follow": false, "checksum_algorithm": "sha1"}}} <<< 49915 1727204302.67506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 49915 1727204302.67530: stderr chunk (state=3): >>><<< 49915 1727204302.67533: stdout chunk (state=3): >>><<< 49915 1727204302.67549: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30380, "dev": 23, "nlink": 1, "atime": 1727204300.8850892, "mtime": 1727204300.8850892, "ctime": 1727204300.8850892, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 
closed. 49915 1727204302.67590: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204302.392453-50704-211518841445072/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204302.67600: _low_level_execute_command(): starting 49915 1727204302.67603: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204302.392453-50704-211518841445072/ > /dev/null 2>&1 && sleep 0' 49915 1727204302.68022: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204302.68032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204302.68051: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204302.68099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204302.68106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204302.68108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204302.68180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204302.70037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204302.70061: stderr chunk (state=3): >>><<< 49915 1727204302.70064: stdout chunk (state=3): >>><<< 49915 1727204302.70078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204302.70085: handler run complete 49915 1727204302.70122: attempt loop complete, returning result 49915 1727204302.70125: _execute() done 49915 1727204302.70127: dumping result to json 49915 1727204302.70131: done dumping result, returning 49915 1727204302.70139: done running TaskExecutor() for managed-node2/TASK: Get stat for interface lsr101 [028d2410-947f-dcd7-b5af-0000000004a4] 49915 1727204302.70144: sending task result for task 028d2410-947f-dcd7-b5af-0000000004a4 49915 1727204302.70248: done sending task result for task 028d2410-947f-dcd7-b5af-0000000004a4 49915 1727204302.70251: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204300.8850892, "block_size": 4096, "blocks": 0, "ctime": 1727204300.8850892, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30380, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "mode": "0777", "mtime": 1727204300.8850892, "nlink": 1, "path": "/sys/class/net/lsr101", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 49915 1727204302.70342: no more pending results, returning what we have 49915 1727204302.70345: results queue empty 49915 1727204302.70346: checking for any_errors_fatal 49915 1727204302.70347: done checking for any_errors_fatal 49915 1727204302.70348: checking for max_fail_percentage 49915 1727204302.70349: done checking for max_fail_percentage 49915 1727204302.70350: checking to see if all hosts have failed and the running result is not ok 49915 1727204302.70351: done checking to see if all hosts have failed 49915 1727204302.70352: getting the remaining hosts for this loop 49915 1727204302.70354: done getting the remaining hosts for this loop 49915 1727204302.70357: getting the next task for host managed-node2 49915 1727204302.70367: done getting next task for host managed-node2 49915 1727204302.70370: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 49915 1727204302.70372: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204302.70382: getting variables 49915 1727204302.70384: in VariableManager get_vars() 49915 1727204302.70479: Calling all_inventory to load vars for managed-node2 49915 1727204302.70482: Calling groups_inventory to load vars for managed-node2 49915 1727204302.70484: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.70499: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.70501: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.70503: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.70609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.70733: done with get_vars() 49915 1727204302.70741: done getting variables 49915 1727204302.70813: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 49915 1727204302.70900: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'lsr101'] ************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:58:22 -0400 (0:00:00.368) 0:00:09.415 ***** 49915 1727204302.70922: entering _queue_task() for managed-node2/assert 49915 1727204302.70924: Creating lock for assert 49915 1727204302.71131: worker is 1 (out of 1 available) 49915 1727204302.71143: exiting _queue_task() for managed-node2/assert 49915 1727204302.71155: done queuing things up, now waiting for results queue to drain 49915 1727204302.71156: waiting for pending results... 
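
The stat payload above and the "Assert that the interface is present" task just queued both come from the included task file assert_device_present.yml, which is not reproduced in this log. Based on the module arguments echoed by _execute_module and the conditional evaluated below (interface_stat.stat.exists), the pair of tasks plausibly looks like the sketch that follows; treat it as a reconstruction rather than the file's actual contents, and note that the register name is a simplification (the log resolves 'interface_stat' from set_fact).

- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
    follow: false
  register: interface_stat        # simplification; the real tasks expose this via set_fact

- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists
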
49915 1727204302.71321: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'lsr101' 49915 1727204302.71390: in run() - task 028d2410-947f-dcd7-b5af-00000000038c 49915 1727204302.71399: variable 'ansible_search_path' from source: unknown 49915 1727204302.71403: variable 'ansible_search_path' from source: unknown 49915 1727204302.71432: calling self._execute() 49915 1727204302.71499: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.71503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.71516: variable 'omit' from source: magic vars 49915 1727204302.71770: variable 'ansible_distribution_major_version' from source: facts 49915 1727204302.71781: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204302.71787: variable 'omit' from source: magic vars 49915 1727204302.71809: variable 'omit' from source: magic vars 49915 1727204302.71880: variable 'interface' from source: play vars 49915 1727204302.71894: variable 'omit' from source: magic vars 49915 1727204302.71931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204302.71955: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204302.71970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204302.71985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204302.71994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204302.72019: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204302.72022: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.72025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.72090: Set connection var ansible_connection to ssh 49915 1727204302.72094: Set connection var ansible_shell_type to sh 49915 1727204302.72099: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204302.72107: Set connection var ansible_shell_executable to /bin/sh 49915 1727204302.72114: Set connection var ansible_timeout to 10 49915 1727204302.72119: Set connection var ansible_pipelining to False 49915 1727204302.72147: variable 'ansible_shell_executable' from source: unknown 49915 1727204302.72152: variable 'ansible_connection' from source: unknown 49915 1727204302.72155: variable 'ansible_module_compression' from source: unknown 49915 1727204302.72158: variable 'ansible_shell_type' from source: unknown 49915 1727204302.72160: variable 'ansible_shell_executable' from source: unknown 49915 1727204302.72162: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.72164: variable 'ansible_pipelining' from source: unknown 49915 1727204302.72167: variable 'ansible_timeout' from source: unknown 49915 1727204302.72170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.72267: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 49915 1727204302.72277: variable 'omit' from source: magic vars 49915 1727204302.72283: starting attempt loop 49915 1727204302.72286: running the handler 49915 1727204302.72370: variable 'interface_stat' from source: set_fact 49915 1727204302.72391: Evaluated conditional (interface_stat.stat.exists): True 49915 1727204302.72394: handler run complete 49915 1727204302.72404: attempt loop complete, returning result 49915 1727204302.72407: _execute() done 49915 1727204302.72410: dumping result to json 49915 1727204302.72414: done dumping result, returning 49915 1727204302.72417: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'lsr101' [028d2410-947f-dcd7-b5af-00000000038c] 49915 1727204302.72423: sending task result for task 028d2410-947f-dcd7-b5af-00000000038c 49915 1727204302.72501: done sending task result for task 028d2410-947f-dcd7-b5af-00000000038c 49915 1727204302.72503: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 49915 1727204302.72554: no more pending results, returning what we have 49915 1727204302.72557: results queue empty 49915 1727204302.72558: checking for any_errors_fatal 49915 1727204302.72564: done checking for any_errors_fatal 49915 1727204302.72565: checking for max_fail_percentage 49915 1727204302.72567: done checking for max_fail_percentage 49915 1727204302.72567: checking to see if all hosts have failed and the running result is not ok 49915 1727204302.72568: done checking to see if all hosts have failed 49915 1727204302.72569: getting the remaining hosts for this loop 49915 1727204302.72571: done getting the remaining hosts for this loop 49915 1727204302.72573: getting the next task for host managed-node2 49915 1727204302.72581: done getting next task for host managed-node2 49915 1727204302.72584: ^ task is: TASK: TEST: I can configure the MTU for a vlan interface without autoconnect. 49915 1727204302.72586: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204302.72589: getting variables 49915 1727204302.72590: in VariableManager get_vars() 49915 1727204302.72625: Calling all_inventory to load vars for managed-node2 49915 1727204302.72628: Calling groups_inventory to load vars for managed-node2 49915 1727204302.72630: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.72639: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.72641: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.72643: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.72765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.72905: done with get_vars() 49915 1727204302.72912: done getting variables 49915 1727204302.72951: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST: I can configure the MTU for a vlan interface without autoconnect.] 
*** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:18 Tuesday 24 September 2024 14:58:22 -0400 (0:00:00.020) 0:00:09.435 ***** 49915 1727204302.72969: entering _queue_task() for managed-node2/debug 49915 1727204302.73147: worker is 1 (out of 1 available) 49915 1727204302.73161: exiting _queue_task() for managed-node2/debug 49915 1727204302.73172: done queuing things up, now waiting for results queue to drain 49915 1727204302.73173: waiting for pending results... 49915 1727204302.73335: running TaskExecutor() for managed-node2/TASK: TEST: I can configure the MTU for a vlan interface without autoconnect. 49915 1727204302.73390: in run() - task 028d2410-947f-dcd7-b5af-00000000000e 49915 1727204302.73402: variable 'ansible_search_path' from source: unknown 49915 1727204302.73434: calling self._execute() 49915 1727204302.73497: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.73501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.73513: variable 'omit' from source: magic vars 49915 1727204302.73771: variable 'ansible_distribution_major_version' from source: facts 49915 1727204302.73783: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204302.73788: variable 'omit' from source: magic vars 49915 1727204302.73802: variable 'omit' from source: magic vars 49915 1727204302.73830: variable 'omit' from source: magic vars 49915 1727204302.73863: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204302.73890: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204302.73906: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204302.73921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204302.73931: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204302.73955: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204302.73958: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.73960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.74027: Set connection var ansible_connection to ssh 49915 1727204302.74031: Set connection var ansible_shell_type to sh 49915 1727204302.74036: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204302.74044: Set connection var ansible_shell_executable to /bin/sh 49915 1727204302.74049: Set connection var ansible_timeout to 10 49915 1727204302.74057: Set connection var ansible_pipelining to False 49915 1727204302.74075: variable 'ansible_shell_executable' from source: unknown 49915 1727204302.74080: variable 'ansible_connection' from source: unknown 49915 1727204302.74082: variable 'ansible_module_compression' from source: unknown 49915 1727204302.74085: variable 'ansible_shell_type' from source: unknown 49915 1727204302.74087: variable 'ansible_shell_executable' from source: unknown 49915 1727204302.74089: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.74092: variable 'ansible_pipelining' from source: unknown 49915 1727204302.74096: variable 'ansible_timeout' from source: 
unknown 49915 1727204302.74100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.74207: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204302.74219: variable 'omit' from source: magic vars 49915 1727204302.74222: starting attempt loop 49915 1727204302.74225: running the handler 49915 1727204302.74259: handler run complete 49915 1727204302.74273: attempt loop complete, returning result 49915 1727204302.74278: _execute() done 49915 1727204302.74281: dumping result to json 49915 1727204302.74283: done dumping result, returning 49915 1727204302.74292: done running TaskExecutor() for managed-node2/TASK: TEST: I can configure the MTU for a vlan interface without autoconnect. [028d2410-947f-dcd7-b5af-00000000000e] 49915 1727204302.74295: sending task result for task 028d2410-947f-dcd7-b5af-00000000000e 49915 1727204302.74370: done sending task result for task 028d2410-947f-dcd7-b5af-00000000000e 49915 1727204302.74373: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: ################################################## 49915 1727204302.74420: no more pending results, returning what we have 49915 1727204302.74423: results queue empty 49915 1727204302.74424: checking for any_errors_fatal 49915 1727204302.74429: done checking for any_errors_fatal 49915 1727204302.74430: checking for max_fail_percentage 49915 1727204302.74431: done checking for max_fail_percentage 49915 1727204302.74432: checking to see if all hosts have failed and the running result is not ok 49915 1727204302.74433: done checking to see if all hosts have failed 49915 1727204302.74434: getting the remaining hosts for this loop 49915 1727204302.74436: done getting the remaining hosts for this loop 49915 1727204302.74439: getting the next task for host managed-node2 49915 1727204302.74445: done getting next task for host managed-node2 49915 1727204302.74450: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 49915 1727204302.74453: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204302.74465: getting variables 49915 1727204302.74467: in VariableManager get_vars() 49915 1727204302.74505: Calling all_inventory to load vars for managed-node2 49915 1727204302.74508: Calling groups_inventory to load vars for managed-node2 49915 1727204302.74510: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.74518: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.74520: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.74523: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.74636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.74759: done with get_vars() 49915 1727204302.74767: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:58:22 -0400 (0:00:00.018) 0:00:09.454 ***** 49915 1727204302.74833: entering _queue_task() for managed-node2/include_tasks 49915 1727204302.75012: worker is 1 (out of 1 available) 49915 1727204302.75024: exiting _queue_task() for managed-node2/include_tasks 49915 1727204302.75036: done queuing things up, now waiting for results queue to drain 49915 1727204302.75037: waiting for pending results... 49915 1727204302.75189: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 49915 1727204302.75267: in run() - task 028d2410-947f-dcd7-b5af-000000000016 49915 1727204302.75276: variable 'ansible_search_path' from source: unknown 49915 1727204302.75281: variable 'ansible_search_path' from source: unknown 49915 1727204302.75307: calling self._execute() 49915 1727204302.75367: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.75371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.75384: variable 'omit' from source: magic vars 49915 1727204302.75630: variable 'ansible_distribution_major_version' from source: facts 49915 1727204302.75639: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204302.75644: _execute() done 49915 1727204302.75647: dumping result to json 49915 1727204302.75650: done dumping result, returning 49915 1727204302.75657: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-dcd7-b5af-000000000016] 49915 1727204302.75662: sending task result for task 028d2410-947f-dcd7-b5af-000000000016 49915 1727204302.75743: done sending task result for task 028d2410-947f-dcd7-b5af-000000000016 49915 1727204302.75746: WORKER PROCESS EXITING 49915 1727204302.75783: no more pending results, returning what we have 49915 1727204302.75787: in VariableManager get_vars() 49915 1727204302.75826: Calling all_inventory to load vars for managed-node2 49915 1727204302.75828: Calling groups_inventory to load vars for managed-node2 49915 1727204302.75830: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.75838: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.75840: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.75843: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.75997: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.76110: done with get_vars() 49915 1727204302.76116: variable 'ansible_search_path' from source: unknown 49915 1727204302.76117: variable 'ansible_search_path' from source: unknown 49915 1727204302.76141: we have included files to process 49915 1727204302.76142: generating all_blocks data 49915 1727204302.76143: done generating all_blocks data 49915 1727204302.76145: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 49915 1727204302.76146: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 49915 1727204302.76147: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 49915 1727204302.76596: done processing included file 49915 1727204302.76598: iterating over new_blocks loaded from include file 49915 1727204302.76599: in VariableManager get_vars() 49915 1727204302.76614: done with get_vars() 49915 1727204302.76615: filtering new block on tags 49915 1727204302.76625: done filtering new block on tags 49915 1727204302.76627: in VariableManager get_vars() 49915 1727204302.76639: done with get_vars() 49915 1727204302.76641: filtering new block on tags 49915 1727204302.76654: done filtering new block on tags 49915 1727204302.76656: in VariableManager get_vars() 49915 1727204302.76669: done with get_vars() 49915 1727204302.76670: filtering new block on tags 49915 1727204302.76682: done filtering new block on tags 49915 1727204302.76683: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 49915 1727204302.76687: extending task lists for all hosts with included blocks 49915 1727204302.77117: done extending task lists 49915 1727204302.77118: done processing included files 49915 1727204302.77118: results queue empty 49915 1727204302.77119: checking for any_errors_fatal 49915 1727204302.77121: done checking for any_errors_fatal 49915 1727204302.77121: checking for max_fail_percentage 49915 1727204302.77122: done checking for max_fail_percentage 49915 1727204302.77122: checking to see if all hosts have failed and the running result is not ok 49915 1727204302.77123: done checking to see if all hosts have failed 49915 1727204302.77123: getting the remaining hosts for this loop 49915 1727204302.77124: done getting the remaining hosts for this loop 49915 1727204302.77126: getting the next task for host managed-node2 49915 1727204302.77128: done getting next task for host managed-node2 49915 1727204302.77130: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 49915 1727204302.77132: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204302.77138: getting variables 49915 1727204302.77138: in VariableManager get_vars() 49915 1727204302.77148: Calling all_inventory to load vars for managed-node2 49915 1727204302.77149: Calling groups_inventory to load vars for managed-node2 49915 1727204302.77150: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.77153: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.77155: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.77156: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.77255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.77366: done with get_vars() 49915 1727204302.77372: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:58:22 -0400 (0:00:00.025) 0:00:09.480 ***** 49915 1727204302.77422: entering _queue_task() for managed-node2/setup 49915 1727204302.77612: worker is 1 (out of 1 available) 49915 1727204302.77624: exiting _queue_task() for managed-node2/setup 49915 1727204302.77637: done queuing things up, now waiting for results queue to drain 49915 1727204302.77638: waiting for pending results... 49915 1727204302.77793: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 49915 1727204302.77880: in run() - task 028d2410-947f-dcd7-b5af-0000000004bf 49915 1727204302.77891: variable 'ansible_search_path' from source: unknown 49915 1727204302.77894: variable 'ansible_search_path' from source: unknown 49915 1727204302.77926: calling self._execute() 49915 1727204302.77983: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.77989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.77998: variable 'omit' from source: magic vars 49915 1727204302.78249: variable 'ansible_distribution_major_version' from source: facts 49915 1727204302.78258: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204302.78400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204302.79829: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204302.79873: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204302.79901: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204302.79934: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204302.79952: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204302.80008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
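
The task being prepared here, "Ensure ansible_facts used by role are present" (set_facts.yml:3), is a guarded fact-gathering call: as the conditional evaluation just below shows, it re-runs setup only when keys listed in __network_required_facts are missing from ansible_facts, and in this run it is skipped because the required facts are already present. A minimal sketch of such a guard, using the exact condition printed in the log (any gather_subset narrowing applied by the real role is unknown here and omitted):

- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
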
49915 1727204302.80031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204302.80052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204302.80079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204302.80090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204302.80128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204302.80150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204302.80165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204302.80192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204302.80203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204302.80310: variable '__network_required_facts' from source: role '' defaults 49915 1727204302.80320: variable 'ansible_facts' from source: unknown 49915 1727204302.80379: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 49915 1727204302.80383: when evaluation is False, skipping this task 49915 1727204302.80385: _execute() done 49915 1727204302.80387: dumping result to json 49915 1727204302.80390: done dumping result, returning 49915 1727204302.80397: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-dcd7-b5af-0000000004bf] 49915 1727204302.80401: sending task result for task 028d2410-947f-dcd7-b5af-0000000004bf 49915 1727204302.80483: done sending task result for task 028d2410-947f-dcd7-b5af-0000000004bf 49915 1727204302.80486: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49915 1727204302.80531: no more pending results, returning what we have 49915 1727204302.80534: results queue empty 49915 1727204302.80535: checking for any_errors_fatal 49915 1727204302.80536: done checking for any_errors_fatal 49915 1727204302.80536: checking for max_fail_percentage 49915 1727204302.80538: done checking for max_fail_percentage 49915 1727204302.80539: checking to see if all hosts have failed and the running 
result is not ok 49915 1727204302.80540: done checking to see if all hosts have failed 49915 1727204302.80540: getting the remaining hosts for this loop 49915 1727204302.80542: done getting the remaining hosts for this loop 49915 1727204302.80546: getting the next task for host managed-node2 49915 1727204302.80554: done getting next task for host managed-node2 49915 1727204302.80557: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 49915 1727204302.80561: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204302.80573: getting variables 49915 1727204302.80574: in VariableManager get_vars() 49915 1727204302.80618: Calling all_inventory to load vars for managed-node2 49915 1727204302.80621: Calling groups_inventory to load vars for managed-node2 49915 1727204302.80622: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.80630: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.80632: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.80635: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.80764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.80909: done with get_vars() 49915 1727204302.80917: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:58:22 -0400 (0:00:00.035) 0:00:09.516 ***** 49915 1727204302.80990: entering _queue_task() for managed-node2/stat 49915 1727204302.81179: worker is 1 (out of 1 available) 49915 1727204302.81192: exiting _queue_task() for managed-node2/stat 49915 1727204302.81205: done queuing things up, now waiting for results queue to drain 49915 1727204302.81206: waiting for pending results... 
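
The "Check if system is ostree" task queued here (set_facts.yml:12) is skipped in the output that follows, because __network_is_ostree already exists from an earlier set_fact, so 'not __network_is_ostree is defined' evaluates to False. Its body is never echoed in this log; the sketch below is an assumption about what such a one-time check typically looks like, with /run/ostree-booted as an assumed marker path and __ostree_booted_stat as a hypothetical variable name:

- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted        # assumed marker path, not shown anywhere in this log
  register: __ostree_booted_stat    # hypothetical name
  when: not __network_is_ostree is defined
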
49915 1727204302.81360: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 49915 1727204302.81448: in run() - task 028d2410-947f-dcd7-b5af-0000000004c1 49915 1727204302.81459: variable 'ansible_search_path' from source: unknown 49915 1727204302.81462: variable 'ansible_search_path' from source: unknown 49915 1727204302.81491: calling self._execute() 49915 1727204302.81550: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.81556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.81564: variable 'omit' from source: magic vars 49915 1727204302.81824: variable 'ansible_distribution_major_version' from source: facts 49915 1727204302.81833: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204302.81944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204302.82135: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204302.82174: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204302.82204: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204302.82229: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204302.82291: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204302.82316: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204302.82333: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204302.82350: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204302.82418: variable '__network_is_ostree' from source: set_fact 49915 1727204302.82421: Evaluated conditional (not __network_is_ostree is defined): False 49915 1727204302.82424: when evaluation is False, skipping this task 49915 1727204302.82426: _execute() done 49915 1727204302.82429: dumping result to json 49915 1727204302.82433: done dumping result, returning 49915 1727204302.82440: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-dcd7-b5af-0000000004c1] 49915 1727204302.82445: sending task result for task 028d2410-947f-dcd7-b5af-0000000004c1 49915 1727204302.82522: done sending task result for task 028d2410-947f-dcd7-b5af-0000000004c1 49915 1727204302.82525: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 49915 1727204302.82571: no more pending results, returning what we have 49915 1727204302.82574: results queue empty 49915 1727204302.82577: checking for any_errors_fatal 49915 1727204302.82584: done checking for any_errors_fatal 49915 1727204302.82585: checking for 
max_fail_percentage 49915 1727204302.82587: done checking for max_fail_percentage 49915 1727204302.82588: checking to see if all hosts have failed and the running result is not ok 49915 1727204302.82589: done checking to see if all hosts have failed 49915 1727204302.82590: getting the remaining hosts for this loop 49915 1727204302.82591: done getting the remaining hosts for this loop 49915 1727204302.82595: getting the next task for host managed-node2 49915 1727204302.82601: done getting next task for host managed-node2 49915 1727204302.82604: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 49915 1727204302.82607: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204302.82622: getting variables 49915 1727204302.82623: in VariableManager get_vars() 49915 1727204302.82655: Calling all_inventory to load vars for managed-node2 49915 1727204302.82657: Calling groups_inventory to load vars for managed-node2 49915 1727204302.82659: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.82667: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.82669: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.82671: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.82800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.82924: done with get_vars() 49915 1727204302.82931: done getting variables 49915 1727204302.82968: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:58:22 -0400 (0:00:00.020) 0:00:09.536 ***** 49915 1727204302.82994: entering _queue_task() for managed-node2/set_fact 49915 1727204302.83179: worker is 1 (out of 1 available) 49915 1727204302.83191: exiting _queue_task() for managed-node2/set_fact 49915 1727204302.83203: done queuing things up, now waiting for results queue to drain 49915 1727204302.83204: waiting for pending results... 
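
"Set flag to indicate system is ostree" (set_facts.yml:17) is the companion task that would turn that stat result into the __network_is_ostree fact the guard checks; it is skipped here for the same reason. A hedged sketch, reusing the hypothetical __ostree_booted_stat name from the previous sketch:

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
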
49915 1727204302.83354: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 49915 1727204302.83436: in run() - task 028d2410-947f-dcd7-b5af-0000000004c2 49915 1727204302.83447: variable 'ansible_search_path' from source: unknown 49915 1727204302.83455: variable 'ansible_search_path' from source: unknown 49915 1727204302.83484: calling self._execute() 49915 1727204302.83540: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.83546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.83558: variable 'omit' from source: magic vars 49915 1727204302.83800: variable 'ansible_distribution_major_version' from source: facts 49915 1727204302.83809: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204302.83919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204302.84157: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204302.84191: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204302.84301: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204302.84305: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204302.84310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204302.84320: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204302.84342: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204302.84360: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204302.84431: variable '__network_is_ostree' from source: set_fact 49915 1727204302.84435: Evaluated conditional (not __network_is_ostree is defined): False 49915 1727204302.84437: when evaluation is False, skipping this task 49915 1727204302.84440: _execute() done 49915 1727204302.84443: dumping result to json 49915 1727204302.84445: done dumping result, returning 49915 1727204302.84453: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-dcd7-b5af-0000000004c2] 49915 1727204302.84457: sending task result for task 028d2410-947f-dcd7-b5af-0000000004c2 49915 1727204302.84532: done sending task result for task 028d2410-947f-dcd7-b5af-0000000004c2 49915 1727204302.84536: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 49915 1727204302.84578: no more pending results, returning what we have 49915 1727204302.84581: results queue empty 49915 1727204302.84582: checking for any_errors_fatal 49915 1727204302.84587: done checking for any_errors_fatal 49915 
1727204302.84588: checking for max_fail_percentage 49915 1727204302.84589: done checking for max_fail_percentage 49915 1727204302.84590: checking to see if all hosts have failed and the running result is not ok 49915 1727204302.84592: done checking to see if all hosts have failed 49915 1727204302.84592: getting the remaining hosts for this loop 49915 1727204302.84594: done getting the remaining hosts for this loop 49915 1727204302.84597: getting the next task for host managed-node2 49915 1727204302.84605: done getting next task for host managed-node2 49915 1727204302.84608: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 49915 1727204302.84611: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204302.84622: getting variables 49915 1727204302.84624: in VariableManager get_vars() 49915 1727204302.84656: Calling all_inventory to load vars for managed-node2 49915 1727204302.84659: Calling groups_inventory to load vars for managed-node2 49915 1727204302.84661: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204302.84668: Calling all_plugins_play to load vars for managed-node2 49915 1727204302.84671: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204302.84673: Calling groups_plugins_play to load vars for managed-node2 49915 1727204302.84837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204302.84955: done with get_vars() 49915 1727204302.84961: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:58:22 -0400 (0:00:00.020) 0:00:09.556 ***** 49915 1727204302.85025: entering _queue_task() for managed-node2/service_facts 49915 1727204302.85026: Creating lock for service_facts 49915 1727204302.85210: worker is 1 (out of 1 available) 49915 1727204302.85223: exiting _queue_task() for managed-node2/service_facts 49915 1727204302.85235: done queuing things up, now waiting for results queue to drain 49915 1727204302.85236: waiting for pending results... 
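
"Check which services are running" (set_facts.yml:21) is handled by the service_facts module, which takes no arguments and publishes its results under ansible_facts.services; the SSH exchange that follows ('echo ~' to resolve the remote home directory, then the umask/mkdir command for the remote temporary directory) is the standard setup that precedes uploading any module. A minimal sketch of the task, plus an illustrative consumer that is not part of this run:

- name: Check which services are running
  ansible.builtin.service_facts:

- name: Report whether NetworkManager is known to the service manager    # illustrative only, not in this log
  ansible.builtin.debug:
    msg: "{{ ansible_facts.services.get('NetworkManager.service', 'not found') }}"
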
49915 1727204302.85386: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 49915 1727204302.85460: in run() - task 028d2410-947f-dcd7-b5af-0000000004c4 49915 1727204302.85473: variable 'ansible_search_path' from source: unknown 49915 1727204302.85478: variable 'ansible_search_path' from source: unknown 49915 1727204302.85503: calling self._execute() 49915 1727204302.85563: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.85566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.85574: variable 'omit' from source: magic vars 49915 1727204302.85837: variable 'ansible_distribution_major_version' from source: facts 49915 1727204302.85845: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204302.85851: variable 'omit' from source: magic vars 49915 1727204302.85898: variable 'omit' from source: magic vars 49915 1727204302.85928: variable 'omit' from source: magic vars 49915 1727204302.85957: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204302.85982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204302.85997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204302.86013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204302.86025: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204302.86048: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204302.86051: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.86054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.86120: Set connection var ansible_connection to ssh 49915 1727204302.86125: Set connection var ansible_shell_type to sh 49915 1727204302.86128: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204302.86139: Set connection var ansible_shell_executable to /bin/sh 49915 1727204302.86143: Set connection var ansible_timeout to 10 49915 1727204302.86150: Set connection var ansible_pipelining to False 49915 1727204302.86166: variable 'ansible_shell_executable' from source: unknown 49915 1727204302.86169: variable 'ansible_connection' from source: unknown 49915 1727204302.86171: variable 'ansible_module_compression' from source: unknown 49915 1727204302.86174: variable 'ansible_shell_type' from source: unknown 49915 1727204302.86178: variable 'ansible_shell_executable' from source: unknown 49915 1727204302.86180: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204302.86183: variable 'ansible_pipelining' from source: unknown 49915 1727204302.86185: variable 'ansible_timeout' from source: unknown 49915 1727204302.86190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204302.86330: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49915 1727204302.86337: variable 'omit' from source: magic vars 49915 
1727204302.86342: starting attempt loop 49915 1727204302.86344: running the handler 49915 1727204302.86359: _low_level_execute_command(): starting 49915 1727204302.86365: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204302.86868: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204302.86872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204302.86874: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204302.86879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204302.86936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204302.86939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204302.86941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204302.87026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204302.88708: stdout chunk (state=3): >>>/root <<< 49915 1727204302.88803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204302.88839: stderr chunk (state=3): >>><<< 49915 1727204302.88842: stdout chunk (state=3): >>><<< 49915 1727204302.88858: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204302.88869: _low_level_execute_command(): starting 49915 1727204302.88877: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204302.8885798-50728-172914189976579 `" && echo ansible-tmp-1727204302.8885798-50728-172914189976579="` echo /root/.ansible/tmp/ansible-tmp-1727204302.8885798-50728-172914189976579 `" ) && sleep 0' 49915 1727204302.89323: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204302.89326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204302.89329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204302.89338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204302.89340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204302.89389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204302.89396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204302.89402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204302.89464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204302.91360: stdout chunk (state=3): >>>ansible-tmp-1727204302.8885798-50728-172914189976579=/root/.ansible/tmp/ansible-tmp-1727204302.8885798-50728-172914189976579 <<< 49915 1727204302.91469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204302.91499: stderr chunk (state=3): >>><<< 49915 1727204302.91502: stdout chunk (state=3): >>><<< 49915 1727204302.91517: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204302.8885798-50728-172914189976579=/root/.ansible/tmp/ansible-tmp-1727204302.8885798-50728-172914189976579 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204302.91558: variable 'ansible_module_compression' from source: unknown 49915 1727204302.91595: ANSIBALLZ: Using lock for service_facts 49915 1727204302.91598: ANSIBALLZ: Acquiring lock 49915 1727204302.91601: ANSIBALLZ: Lock acquired: 140698012293104 49915 1727204302.91603: ANSIBALLZ: Creating module 49915 1727204302.99518: ANSIBALLZ: Writing module into payload 49915 1727204302.99585: ANSIBALLZ: Writing module 49915 1727204302.99607: ANSIBALLZ: Renaming module 49915 1727204302.99612: ANSIBALLZ: Done creating module 49915 1727204302.99631: variable 'ansible_facts' from source: unknown 49915 1727204302.99675: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204302.8885798-50728-172914189976579/AnsiballZ_service_facts.py 49915 1727204302.99781: Sending initial data 49915 1727204302.99784: Sent initial data (162 bytes) 49915 1727204303.00250: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204303.00253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204303.00256: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204303.00259: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204303.00261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 49915 1727204303.00262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204303.00321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204303.00324: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204303.00330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204303.00407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204303.02028: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204303.02092: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49915 1727204303.02163: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmppy988d0l /root/.ansible/tmp/ansible-tmp-1727204302.8885798-50728-172914189976579/AnsiballZ_service_facts.py <<< 49915 1727204303.02170: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204302.8885798-50728-172914189976579/AnsiballZ_service_facts.py" <<< 49915 1727204303.02235: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmppy988d0l" to remote "/root/.ansible/tmp/ansible-tmp-1727204302.8885798-50728-172914189976579/AnsiballZ_service_facts.py" <<< 49915 1727204303.02238: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204302.8885798-50728-172914189976579/AnsiballZ_service_facts.py" <<< 49915 1727204303.02897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204303.02942: stderr chunk (state=3): >>><<< 49915 1727204303.02945: stdout chunk (state=3): >>><<< 49915 1727204303.03014: done transferring module to remote 49915 1727204303.03022: _low_level_execute_command(): starting 49915 1727204303.03027: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204302.8885798-50728-172914189976579/ /root/.ansible/tmp/ansible-tmp-1727204302.8885798-50728-172914189976579/AnsiballZ_service_facts.py && sleep 0' 49915 1727204303.03481: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204303.03484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204303.03491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204303.03493: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204303.03499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204303.03539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204303.03553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204303.03635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204303.05426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204303.05452: stderr chunk (state=3): >>><<< 49915 1727204303.05455: stdout chunk (state=3): >>><<< 49915 1727204303.05469: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204303.05472: _low_level_execute_command(): starting 49915 1727204303.05479: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204302.8885798-50728-172914189976579/AnsiballZ_service_facts.py && sleep 0' 49915 1727204303.05924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204303.05928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204303.05930: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204303.05932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204303.05982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204303.05986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204303.06002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204303.06071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204304.63949: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 49915 1727204304.63965: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 49915 1727204304.64010: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", 
"source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state":<<< 49915 1727204304.64019: stdout chunk (state=3): >>> "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 49915 1727204304.65586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 49915 1727204304.65619: stderr chunk (state=3): >>><<< 49915 1727204304.65622: stdout chunk (state=3): >>><<< 49915 1727204304.65643: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": 
"dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": 
"fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": 
"indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 49915 1727204304.67238: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204302.8885798-50728-172914189976579/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204304.67245: _low_level_execute_command(): starting 49915 1727204304.67250: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204302.8885798-50728-172914189976579/ > /dev/null 2>&1 && sleep 0' 49915 1727204304.67665: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204304.67669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204304.67699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204304.67702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 49915 1727204304.67705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204304.67707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204304.67763: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204304.67766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204304.67771: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204304.67846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204304.69788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 
1727204304.69849: stderr chunk (state=3): >>><<< 49915 1727204304.69854: stdout chunk (state=3): >>><<< 49915 1727204304.69881: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204304.69884: handler run complete 49915 1727204304.70050: variable 'ansible_facts' from source: unknown 49915 1727204304.70145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204304.70558: variable 'ansible_facts' from source: unknown 49915 1727204304.70709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204304.70862: attempt loop complete, returning result 49915 1727204304.70866: _execute() done 49915 1727204304.70868: dumping result to json 49915 1727204304.70905: done dumping result, returning 49915 1727204304.70913: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-dcd7-b5af-0000000004c4] 49915 1727204304.70920: sending task result for task 028d2410-947f-dcd7-b5af-0000000004c4 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49915 1727204304.72229: no more pending results, returning what we have 49915 1727204304.72232: results queue empty 49915 1727204304.72235: checking for any_errors_fatal 49915 1727204304.72239: done checking for any_errors_fatal 49915 1727204304.72240: checking for max_fail_percentage 49915 1727204304.72243: done checking for max_fail_percentage 49915 1727204304.72244: checking to see if all hosts have failed and the running result is not ok 49915 1727204304.72244: done checking to see if all hosts have failed 49915 1727204304.72245: getting the remaining hosts for this loop 49915 1727204304.72246: done getting the remaining hosts for this loop 49915 1727204304.72250: getting the next task for host managed-node2 49915 1727204304.72255: done getting next task for host managed-node2 49915 1727204304.72258: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 49915 1727204304.72262: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204304.72274: done sending task result for task 028d2410-947f-dcd7-b5af-0000000004c4 49915 1727204304.72279: WORKER PROCESS EXITING 49915 1727204304.72285: getting variables 49915 1727204304.72286: in VariableManager get_vars() 49915 1727204304.72309: Calling all_inventory to load vars for managed-node2 49915 1727204304.72311: Calling groups_inventory to load vars for managed-node2 49915 1727204304.72315: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204304.72324: Calling all_plugins_play to load vars for managed-node2 49915 1727204304.72327: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204304.72330: Calling groups_plugins_play to load vars for managed-node2 49915 1727204304.72561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204304.72995: done with get_vars() 49915 1727204304.73008: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:58:24 -0400 (0:00:01.881) 0:00:11.437 ***** 49915 1727204304.73133: entering _queue_task() for managed-node2/package_facts 49915 1727204304.73135: Creating lock for package_facts 49915 1727204304.73421: worker is 1 (out of 1 available) 49915 1727204304.73434: exiting _queue_task() for managed-node2/package_facts 49915 1727204304.73446: done queuing things up, now waiting for results queue to drain 49915 1727204304.73447: waiting for pending results... 
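
A minimal sketch, not part of the run above: the verbose service_facts payload printed earlier maps each unit name to {"name", "state", "status", "source"} under ansible_facts.services (the play output itself is censored because no_log: true was set). Assuming that JSON result had been saved to a local file (the services.json path and the summarize_services helper below are hypothetical, introduced only for illustration), this is one way to slice it the same way a playbook condition on ansible_facts.services would.

import json

def summarize_services(path="services.json"):
    # Load the saved module result; the mapping of interest lives under
    # ansible_facts.services, as seen in the log output above.
    with open(path) as fh:
        result = json.load(fh)
    services = result["ansible_facts"]["services"]

    # Group unit names by their reported runtime state
    # ("running", "stopped", "inactive", "unknown", ...).
    by_state = {}
    for name, info in services.items():
        by_state.setdefault(info["state"], []).append(name)

    # Units currently running whose unit-file status is not "enabled"
    # (e.g. gssproxy.service reports running/disabled in the dump above).
    running_not_enabled = sorted(
        name
        for name, info in services.items()
        if info["state"] == "running" and info["status"] != "enabled"
    )
    return by_state, running_not_enabled

if __name__ == "__main__":
    states, odd_ones = summarize_services()
    for state, names in sorted(states.items()):
        print(f"{state}: {len(names)} units")
    print("running but not enabled:", odd_ones)
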
49915 1727204304.73660: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 49915 1727204304.73759: in run() - task 028d2410-947f-dcd7-b5af-0000000004c5 49915 1727204304.73773: variable 'ansible_search_path' from source: unknown 49915 1727204304.73783: variable 'ansible_search_path' from source: unknown 49915 1727204304.73819: calling self._execute() 49915 1727204304.73892: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204304.73896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204304.73905: variable 'omit' from source: magic vars 49915 1727204304.74196: variable 'ansible_distribution_major_version' from source: facts 49915 1727204304.74206: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204304.74211: variable 'omit' from source: magic vars 49915 1727204304.74303: variable 'omit' from source: magic vars 49915 1727204304.74317: variable 'omit' from source: magic vars 49915 1727204304.74358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204304.74391: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204304.74404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204304.74440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204304.74446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204304.74483: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204304.74486: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204304.74488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204304.74548: Set connection var ansible_connection to ssh 49915 1727204304.74551: Set connection var ansible_shell_type to sh 49915 1727204304.74556: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204304.74564: Set connection var ansible_shell_executable to /bin/sh 49915 1727204304.74569: Set connection var ansible_timeout to 10 49915 1727204304.74578: Set connection var ansible_pipelining to False 49915 1727204304.74596: variable 'ansible_shell_executable' from source: unknown 49915 1727204304.74600: variable 'ansible_connection' from source: unknown 49915 1727204304.74602: variable 'ansible_module_compression' from source: unknown 49915 1727204304.74607: variable 'ansible_shell_type' from source: unknown 49915 1727204304.74610: variable 'ansible_shell_executable' from source: unknown 49915 1727204304.74614: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204304.74617: variable 'ansible_pipelining' from source: unknown 49915 1727204304.74619: variable 'ansible_timeout' from source: unknown 49915 1727204304.74621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204304.74832: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49915 1727204304.74841: variable 'omit' from source: magic vars 49915 
1727204304.74844: starting attempt loop 49915 1727204304.74846: running the handler 49915 1727204304.74860: _low_level_execute_command(): starting 49915 1727204304.74866: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204304.75378: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204304.75383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 49915 1727204304.75386: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204304.75443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204304.75450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204304.75452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204304.75526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204304.77221: stdout chunk (state=3): >>>/root <<< 49915 1727204304.77328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204304.77353: stderr chunk (state=3): >>><<< 49915 1727204304.77356: stdout chunk (state=3): >>><<< 49915 1727204304.77374: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204304.77386: _low_level_execute_command(): starting 49915 1727204304.77392: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204304.7737398-50800-245886516833127 `" && echo ansible-tmp-1727204304.7737398-50800-245886516833127="` echo /root/.ansible/tmp/ansible-tmp-1727204304.7737398-50800-245886516833127 `" ) && sleep 0' 49915 1727204304.78045: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204304.78096: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204304.78101: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204304.78170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204304.78259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204304.80172: stdout chunk (state=3): >>>ansible-tmp-1727204304.7737398-50800-245886516833127=/root/.ansible/tmp/ansible-tmp-1727204304.7737398-50800-245886516833127 <<< 49915 1727204304.80311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204304.80343: stderr chunk (state=3): >>><<< 49915 1727204304.80346: stdout chunk (state=3): >>><<< 49915 1727204304.80360: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204304.7737398-50800-245886516833127=/root/.ansible/tmp/ansible-tmp-1727204304.7737398-50800-245886516833127 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204304.80397: variable 'ansible_module_compression' from source: unknown 49915 1727204304.80454: ANSIBALLZ: Using lock for package_facts 49915 
1727204304.80457: ANSIBALLZ: Acquiring lock 49915 1727204304.80459: ANSIBALLZ: Lock acquired: 140698006178080 49915 1727204304.80461: ANSIBALLZ: Creating module 49915 1727204305.09082: ANSIBALLZ: Writing module into payload 49915 1727204305.09087: ANSIBALLZ: Writing module 49915 1727204305.09123: ANSIBALLZ: Renaming module 49915 1727204305.09137: ANSIBALLZ: Done creating module 49915 1727204305.09181: variable 'ansible_facts' from source: unknown 49915 1727204305.09381: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204304.7737398-50800-245886516833127/AnsiballZ_package_facts.py 49915 1727204305.09608: Sending initial data 49915 1727204305.09614: Sent initial data (162 bytes) 49915 1727204305.10196: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204305.10265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204305.10322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204305.10338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204305.10578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204305.10880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204305.12548: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204305.12616: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49915 1727204305.12688: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp9o2un69y /root/.ansible/tmp/ansible-tmp-1727204304.7737398-50800-245886516833127/AnsiballZ_package_facts.py <<< 49915 1727204305.12700: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204304.7737398-50800-245886516833127/AnsiballZ_package_facts.py" <<< 49915 1727204305.12781: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp9o2un69y" to remote "/root/.ansible/tmp/ansible-tmp-1727204304.7737398-50800-245886516833127/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204304.7737398-50800-245886516833127/AnsiballZ_package_facts.py" <<< 49915 1727204305.15659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204305.15672: stdout chunk (state=3): >>><<< 49915 1727204305.15705: stderr chunk (state=3): >>><<< 49915 1727204305.16034: done transferring module to remote 49915 1727204305.16038: _low_level_execute_command(): starting 49915 1727204305.16041: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204304.7737398-50800-245886516833127/ /root/.ansible/tmp/ansible-tmp-1727204304.7737398-50800-245886516833127/AnsiballZ_package_facts.py && sleep 0' 49915 1727204305.17055: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204305.17068: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204305.17183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204305.17195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204305.17212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204305.17320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204305.19371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204305.19378: stdout chunk (state=3): >>><<< 49915 1727204305.19381: stderr chunk (state=3): >>><<< 49915 1727204305.19397: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204305.19406: _low_level_execute_command(): starting 49915 1727204305.19415: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204304.7737398-50800-245886516833127/AnsiballZ_package_facts.py && sleep 0' 49915 1727204305.20574: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204305.20659: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204305.20668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204305.20690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204305.20702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204305.20766: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204305.21045: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204305.21048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204305.21051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204305.21162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204305.65992: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 49915 1727204305.66122: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": 
[{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": 
"p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": 
"2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 49915 1727204305.66143: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", 
"version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": 
"14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": 
[{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": 
"irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name":<<< 49915 1727204305.66159: stdout chunk (state=3): >>> "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": 
"perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch<<< 49915 1727204305.66173: stdout chunk (state=3): >>>": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": 
"dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 49915 1727204305.68046: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 49915 1727204305.68049: stdout chunk (state=3): >>><<< 49915 1727204305.68052: stderr chunk (state=3): >>><<< 49915 1727204305.68389: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": { ... package list identical to the stdout chunk above ... }}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 49915 1727204305.73396: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204304.7737398-50800-245886516833127/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204305.73429: _low_level_execute_command(): starting 49915 1727204305.73441: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204304.7737398-50800-245886516833127/ > /dev/null 2>&1 && sleep 0' 49915 1727204305.74798: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204305.74802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204305.74804: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 49915 1727204305.74807: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204305.74809: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204305.75082: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204305.75097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 
1727204305.76962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204305.76997: stderr chunk (state=3): >>><<< 49915 1727204305.77030: stdout chunk (state=3): >>><<< 49915 1727204305.77240: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204305.77243: handler run complete 49915 1727204305.78822: variable 'ansible_facts' from source: unknown 49915 1727204305.79751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204305.83949: variable 'ansible_facts' from source: unknown 49915 1727204305.84911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204305.86377: attempt loop complete, returning result 49915 1727204305.86404: _execute() done 49915 1727204305.86417: dumping result to json 49915 1727204305.86860: done dumping result, returning 49915 1727204305.86874: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-dcd7-b5af-0000000004c5] 49915 1727204305.86884: sending task result for task 028d2410-947f-dcd7-b5af-0000000004c5 49915 1727204306.02039: done sending task result for task 028d2410-947f-dcd7-b5af-0000000004c5 49915 1727204306.02043: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49915 1727204306.02141: no more pending results, returning what we have 49915 1727204306.02144: results queue empty 49915 1727204306.02145: checking for any_errors_fatal 49915 1727204306.02150: done checking for any_errors_fatal 49915 1727204306.02151: checking for max_fail_percentage 49915 1727204306.02152: done checking for max_fail_percentage 49915 1727204306.02153: checking to see if all hosts have failed and the running result is not ok 49915 1727204306.02154: done checking to see if all hosts have failed 49915 1727204306.02155: getting the remaining hosts for this loop 49915 1727204306.02156: done getting the remaining hosts for this loop 49915 1727204306.02160: getting the next task for host managed-node2 49915 1727204306.02166: done getting next task for host managed-node2 49915 1727204306.02170: ^ task is: TASK: fedora.linux_system_roles.network : Print 
network provider 49915 1727204306.02172: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204306.02183: getting variables 49915 1727204306.02185: in VariableManager get_vars() 49915 1727204306.02217: Calling all_inventory to load vars for managed-node2 49915 1727204306.02220: Calling groups_inventory to load vars for managed-node2 49915 1727204306.02221: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204306.02229: Calling all_plugins_play to load vars for managed-node2 49915 1727204306.02231: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204306.02233: Calling groups_plugins_play to load vars for managed-node2 49915 1727204306.04446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204306.07668: done with get_vars() 49915 1727204306.07899: done getting variables 49915 1727204306.07966: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:58:26 -0400 (0:00:01.348) 0:00:12.786 ***** 49915 1727204306.08008: entering _queue_task() for managed-node2/debug 49915 1727204306.08753: worker is 1 (out of 1 available) 49915 1727204306.08765: exiting _queue_task() for managed-node2/debug 49915 1727204306.08982: done queuing things up, now waiting for results queue to drain 49915 1727204306.08983: waiting for pending results... 
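[Editor's sketch] The "Check which packages are installed" task whose result is censored above ("no_log: true") is, per the module invocation logged earlier, a plain package_facts call with manager "auto" and strategy "first". The YAML below is a reconstruction from that invocation, not the role's verbatim source; the gathered data lands in ansible_facts.packages as a map of package name to a list of {version, release, epoch, arch, source} entries, which is exactly the JSON shape seen above.

# Sketch of the role's "Check which packages are installed" task.
# manager and no_log are confirmed by the logged invocation; the wording is assumed.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto        # logged module_args: {"manager": ["auto"], "strategy": "first"}
  no_log: true           # matches the "censored ... 'no_log: true'" result above
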
49915 1727204306.09296: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 49915 1727204306.09583: in run() - task 028d2410-947f-dcd7-b5af-000000000017 49915 1727204306.09588: variable 'ansible_search_path' from source: unknown 49915 1727204306.09591: variable 'ansible_search_path' from source: unknown 49915 1727204306.09594: calling self._execute() 49915 1727204306.09771: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204306.09784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204306.09796: variable 'omit' from source: magic vars 49915 1727204306.10539: variable 'ansible_distribution_major_version' from source: facts 49915 1727204306.10715: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204306.10719: variable 'omit' from source: magic vars 49915 1727204306.10722: variable 'omit' from source: magic vars 49915 1727204306.10887: variable 'network_provider' from source: set_fact 49915 1727204306.10950: variable 'omit' from source: magic vars 49915 1727204306.11149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204306.11152: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204306.11155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204306.11484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204306.11488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204306.11492: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204306.11495: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204306.11498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204306.11532: Set connection var ansible_connection to ssh 49915 1727204306.11539: Set connection var ansible_shell_type to sh 49915 1727204306.11550: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204306.11563: Set connection var ansible_shell_executable to /bin/sh 49915 1727204306.11599: Set connection var ansible_timeout to 10 49915 1727204306.11615: Set connection var ansible_pipelining to False 49915 1727204306.11635: variable 'ansible_shell_executable' from source: unknown 49915 1727204306.11638: variable 'ansible_connection' from source: unknown 49915 1727204306.11641: variable 'ansible_module_compression' from source: unknown 49915 1727204306.11643: variable 'ansible_shell_type' from source: unknown 49915 1727204306.11646: variable 'ansible_shell_executable' from source: unknown 49915 1727204306.11648: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204306.11650: variable 'ansible_pipelining' from source: unknown 49915 1727204306.11652: variable 'ansible_timeout' from source: unknown 49915 1727204306.11658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204306.11991: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 49915 1727204306.12048: variable 'omit' from source: magic vars 49915 1727204306.12152: starting attempt loop 49915 1727204306.12155: running the handler 49915 1727204306.12202: handler run complete 49915 1727204306.12217: attempt loop complete, returning result 49915 1727204306.12221: _execute() done 49915 1727204306.12224: dumping result to json 49915 1727204306.12226: done dumping result, returning 49915 1727204306.12229: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-dcd7-b5af-000000000017] 49915 1727204306.12235: sending task result for task 028d2410-947f-dcd7-b5af-000000000017 49915 1727204306.12497: done sending task result for task 028d2410-947f-dcd7-b5af-000000000017 49915 1727204306.12499: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 49915 1727204306.12557: no more pending results, returning what we have 49915 1727204306.12560: results queue empty 49915 1727204306.12561: checking for any_errors_fatal 49915 1727204306.12570: done checking for any_errors_fatal 49915 1727204306.12570: checking for max_fail_percentage 49915 1727204306.12572: done checking for max_fail_percentage 49915 1727204306.12573: checking to see if all hosts have failed and the running result is not ok 49915 1727204306.12574: done checking to see if all hosts have failed 49915 1727204306.12576: getting the remaining hosts for this loop 49915 1727204306.12578: done getting the remaining hosts for this loop 49915 1727204306.12582: getting the next task for host managed-node2 49915 1727204306.12588: done getting next task for host managed-node2 49915 1727204306.12592: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 49915 1727204306.12595: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204306.12606: getting variables 49915 1727204306.12609: in VariableManager get_vars() 49915 1727204306.12655: Calling all_inventory to load vars for managed-node2 49915 1727204306.12657: Calling groups_inventory to load vars for managed-node2 49915 1727204306.12661: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204306.12670: Calling all_plugins_play to load vars for managed-node2 49915 1727204306.12673: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204306.12882: Calling groups_plugins_play to load vars for managed-node2 49915 1727204306.15448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204306.18855: done with get_vars() 49915 1727204306.18888: done getting variables 49915 1727204306.19257: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:58:26 -0400 (0:00:00.112) 0:00:12.899 ***** 49915 1727204306.19296: entering _queue_task() for managed-node2/fail 49915 1727204306.19732: worker is 1 (out of 1 available) 49915 1727204306.19745: exiting _queue_task() for managed-node2/fail 49915 1727204306.19757: done queuing things up, now waiting for results queue to drain 49915 1727204306.19759: waiting for pending results... 
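[Editor's sketch] The debug task that produced "Using network provider: nm" above (task path roles/network/tasks/main.yml:7) is presumably a one-line debug; the message text is confirmed by the output and the log shows network_provider coming from set_fact, but the exact YAML is a sketch rather than the role's verbatim source.

# Sketch of the "Print network provider" task at main.yml:7.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"   # network_provider comes from set_fact, per the log
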
49915 1727204306.20899: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 49915 1727204306.20904: in run() - task 028d2410-947f-dcd7-b5af-000000000018 49915 1727204306.20908: variable 'ansible_search_path' from source: unknown 49915 1727204306.20910: variable 'ansible_search_path' from source: unknown 49915 1727204306.21187: calling self._execute() 49915 1727204306.21267: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204306.21302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204306.21311: variable 'omit' from source: magic vars 49915 1727204306.21903: variable 'ansible_distribution_major_version' from source: facts 49915 1727204306.21923: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204306.22164: variable 'network_state' from source: role '' defaults 49915 1727204306.22187: Evaluated conditional (network_state != {}): False 49915 1727204306.22196: when evaluation is False, skipping this task 49915 1727204306.22204: _execute() done 49915 1727204306.22212: dumping result to json 49915 1727204306.22220: done dumping result, returning 49915 1727204306.22230: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-dcd7-b5af-000000000018] 49915 1727204306.22241: sending task result for task 028d2410-947f-dcd7-b5af-000000000018 49915 1727204306.22582: done sending task result for task 028d2410-947f-dcd7-b5af-000000000018 49915 1727204306.22585: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49915 1727204306.22624: no more pending results, returning what we have 49915 1727204306.22627: results queue empty 49915 1727204306.22628: checking for any_errors_fatal 49915 1727204306.22633: done checking for any_errors_fatal 49915 1727204306.22633: checking for max_fail_percentage 49915 1727204306.22635: done checking for max_fail_percentage 49915 1727204306.22636: checking to see if all hosts have failed and the running result is not ok 49915 1727204306.22637: done checking to see if all hosts have failed 49915 1727204306.22637: getting the remaining hosts for this loop 49915 1727204306.22639: done getting the remaining hosts for this loop 49915 1727204306.22642: getting the next task for host managed-node2 49915 1727204306.22647: done getting next task for host managed-node2 49915 1727204306.22651: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 49915 1727204306.22653: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204306.22665: getting variables 49915 1727204306.22667: in VariableManager get_vars() 49915 1727204306.22708: Calling all_inventory to load vars for managed-node2 49915 1727204306.22714: Calling groups_inventory to load vars for managed-node2 49915 1727204306.22717: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204306.22726: Calling all_plugins_play to load vars for managed-node2 49915 1727204306.22729: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204306.22732: Calling groups_plugins_play to load vars for managed-node2 49915 1727204306.25628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204306.28027: done with get_vars() 49915 1727204306.28174: done getting variables 49915 1727204306.28240: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:58:26 -0400 (0:00:00.089) 0:00:12.988 ***** 49915 1727204306.28274: entering _queue_task() for managed-node2/fail 49915 1727204306.28850: worker is 1 (out of 1 available) 49915 1727204306.28862: exiting _queue_task() for managed-node2/fail 49915 1727204306.28877: done queuing things up, now waiting for results queue to drain 49915 1727204306.28879: waiting for pending results... 
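[Editor's sketch] The skip above records the first failing condition, network_state != {}, for the initscripts abort task at main.yml:11. A plausible shape for such a guard follows; only the network_state condition is confirmed by this log, while the provider check and the message are inferred from the task name.

# Sketch of the abort task at main.yml:11; conditions beyond network_state != {} are inferred.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported with the initscripts provider
  when:
    - network_state != {}                  # evaluated False in this run, so the task was skipped
    - network_provider == "initscripts"    # inferred from the task name, not visible in the log
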
49915 1727204306.29490: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 49915 1727204306.29655: in run() - task 028d2410-947f-dcd7-b5af-000000000019 49915 1727204306.29680: variable 'ansible_search_path' from source: unknown 49915 1727204306.29688: variable 'ansible_search_path' from source: unknown 49915 1727204306.29748: calling self._execute() 49915 1727204306.29835: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204306.29846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204306.29857: variable 'omit' from source: magic vars 49915 1727204306.30246: variable 'ansible_distribution_major_version' from source: facts 49915 1727204306.30349: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204306.30411: variable 'network_state' from source: role '' defaults 49915 1727204306.30430: Evaluated conditional (network_state != {}): False 49915 1727204306.30439: when evaluation is False, skipping this task 49915 1727204306.30448: _execute() done 49915 1727204306.30463: dumping result to json 49915 1727204306.30471: done dumping result, returning 49915 1727204306.30487: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-dcd7-b5af-000000000019] 49915 1727204306.30498: sending task result for task 028d2410-947f-dcd7-b5af-000000000019 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49915 1727204306.30764: no more pending results, returning what we have 49915 1727204306.30768: results queue empty 49915 1727204306.30769: checking for any_errors_fatal 49915 1727204306.30779: done checking for any_errors_fatal 49915 1727204306.30780: checking for max_fail_percentage 49915 1727204306.30782: done checking for max_fail_percentage 49915 1727204306.30783: checking to see if all hosts have failed and the running result is not ok 49915 1727204306.30785: done checking to see if all hosts have failed 49915 1727204306.30785: getting the remaining hosts for this loop 49915 1727204306.30787: done getting the remaining hosts for this loop 49915 1727204306.30792: getting the next task for host managed-node2 49915 1727204306.30800: done getting next task for host managed-node2 49915 1727204306.30804: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 49915 1727204306.30807: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204306.30823: getting variables 49915 1727204306.30825: in VariableManager get_vars() 49915 1727204306.30874: Calling all_inventory to load vars for managed-node2 49915 1727204306.30884: Calling groups_inventory to load vars for managed-node2 49915 1727204306.30887: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204306.30893: done sending task result for task 028d2410-947f-dcd7-b5af-000000000019 49915 1727204306.30896: WORKER PROCESS EXITING 49915 1727204306.30907: Calling all_plugins_play to load vars for managed-node2 49915 1727204306.30910: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204306.30913: Calling groups_plugins_play to load vars for managed-node2 49915 1727204306.33882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204306.35990: done with get_vars() 49915 1727204306.36022: done getting variables 49915 1727204306.36101: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:58:26 -0400 (0:00:00.078) 0:00:13.067 ***** 49915 1727204306.36136: entering _queue_task() for managed-node2/fail 49915 1727204306.36579: worker is 1 (out of 1 available) 49915 1727204306.36592: exiting _queue_task() for managed-node2/fail 49915 1727204306.36603: done queuing things up, now waiting for results queue to drain 49915 1727204306.36605: waiting for pending results... 
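[Editor's sketch] The "system version below 8" abort at main.yml:18 was skipped for the same reason (network_state != {} evaluated False), so its other condition never appears in the log. The version bound and message below are therefore inferred from the task name only.

# Sketch of the abort task at main.yml:18; the version bound is inferred from the task name.
- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying the network state configuration requires a managed host running EL 8 or later
  when:
    - network_state != {}                            # evaluated False here, so the task was skipped
    - ansible_distribution_major_version | int < 8   # inferred, not shown in this run
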
49915 1727204306.36793: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 49915 1727204306.36936: in run() - task 028d2410-947f-dcd7-b5af-00000000001a 49915 1727204306.36960: variable 'ansible_search_path' from source: unknown 49915 1727204306.36969: variable 'ansible_search_path' from source: unknown 49915 1727204306.37281: calling self._execute() 49915 1727204306.37286: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204306.37289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204306.37501: variable 'omit' from source: magic vars 49915 1727204306.38327: variable 'ansible_distribution_major_version' from source: facts 49915 1727204306.38391: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204306.38761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204306.41849: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204306.41907: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204306.41936: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204306.41963: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204306.41984: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204306.42045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204306.42066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204306.42085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204306.42115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204306.42125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204306.42197: variable 'ansible_distribution_major_version' from source: facts 49915 1727204306.42210: Evaluated conditional (ansible_distribution_major_version | int > 9): True 49915 1727204306.42293: variable 'ansible_distribution' from source: facts 49915 1727204306.42298: variable '__network_rh_distros' from source: role '' defaults 49915 1727204306.42306: Evaluated conditional (ansible_distribution in __network_rh_distros): True 49915 1727204306.42465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204306.42484: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204306.42500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204306.42527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204306.42538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204306.42574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204306.42591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204306.42608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204306.42633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204306.42643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204306.42677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204306.42705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204306.42735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204306.42764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204306.42799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204306.43281: variable 'network_connections' from source: task vars 49915 1727204306.43284: variable 'interface' from source: play vars 49915 1727204306.43287: variable 'interface' from source: play vars 49915 1727204306.43289: variable 'vlan_interface' from source: play vars 49915 1727204306.43291: variable 'vlan_interface' from source: play vars 49915 1727204306.43293: variable 'interface' from source: play vars 49915 
1727204306.43381: variable 'interface' from source: play vars 49915 1727204306.43385: variable 'network_state' from source: role '' defaults 49915 1727204306.43400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204306.44086: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204306.44090: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204306.44092: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204306.44094: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204306.44105: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204306.44107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204306.44193: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204306.44196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204306.44199: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 49915 1727204306.44202: when evaluation is False, skipping this task 49915 1727204306.44204: _execute() done 49915 1727204306.44206: dumping result to json 49915 1727204306.44208: done dumping result, returning 49915 1727204306.44210: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-dcd7-b5af-00000000001a] 49915 1727204306.44212: sending task result for task 028d2410-947f-dcd7-b5af-00000000001a skipping: [managed-node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 49915 1727204306.44373: no more pending results, returning what we have 49915 1727204306.44378: results queue empty 49915 1727204306.44379: checking for any_errors_fatal 49915 1727204306.44385: done checking for any_errors_fatal 49915 1727204306.44386: checking for max_fail_percentage 49915 1727204306.44388: done checking for max_fail_percentage 49915 1727204306.44389: checking to see if all hosts have failed and the running result is not ok 49915 1727204306.44390: done checking to see if all hosts have failed 49915 1727204306.44391: getting the remaining hosts for this loop 49915 1727204306.44392: done getting the remaining hosts for this loop 49915 1727204306.44396: getting the 
next task for host managed-node2 49915 1727204306.44406: done getting next task for host managed-node2 49915 1727204306.44411: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 49915 1727204306.44413: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204306.44426: getting variables 49915 1727204306.44513: in VariableManager get_vars() 49915 1727204306.44566: Calling all_inventory to load vars for managed-node2 49915 1727204306.44568: Calling groups_inventory to load vars for managed-node2 49915 1727204306.44571: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204306.44582: Calling all_plugins_play to load vars for managed-node2 49915 1727204306.44584: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204306.44587: Calling groups_plugins_play to load vars for managed-node2 49915 1727204306.44628: done sending task result for task 028d2410-947f-dcd7-b5af-00000000001a 49915 1727204306.44632: WORKER PROCESS EXITING 49915 1727204306.46192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204306.47639: done with get_vars() 49915 1727204306.47667: done getting variables 49915 1727204306.47771: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:58:26 -0400 (0:00:00.116) 0:00:13.184 ***** 49915 1727204306.47805: entering _queue_task() for managed-node2/dnf 49915 1727204306.48138: worker is 1 (out of 1 available) 49915 1727204306.48151: exiting _queue_task() for managed-node2/dnf 49915 1727204306.48164: done queuing things up, now waiting for results queue to drain 49915 1727204306.48165: waiting for pending results... 
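[Editor's sketch] For the teaming abort at main.yml:25, all of the gating conditions are visible in the "Evaluated conditional" lines above: distribution major version > 9, distribution in __network_rh_distros, and the selectattr test over network_connections / network_state that came back False. The structure below is reconstructed from those evaluations; the fail message is an assumption.

# Sketch of the teaming abort task at main.yml:25; the when entries are taken from the
# "Evaluated conditional" lines above, the message text is assumed.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    - network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0
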
49915 1727204306.48450: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 49915 1727204306.48597: in run() - task 028d2410-947f-dcd7-b5af-00000000001b 49915 1727204306.48781: variable 'ansible_search_path' from source: unknown 49915 1727204306.48785: variable 'ansible_search_path' from source: unknown 49915 1727204306.48787: calling self._execute() 49915 1727204306.48790: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204306.48794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204306.48796: variable 'omit' from source: magic vars 49915 1727204306.49163: variable 'ansible_distribution_major_version' from source: facts 49915 1727204306.49182: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204306.49385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204306.51612: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204306.51697: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204306.51742: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204306.51784: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204306.51817: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204306.51902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204306.51937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204306.51971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204306.52019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204306.52040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204306.52162: variable 'ansible_distribution' from source: facts 49915 1727204306.52179: variable 'ansible_distribution_major_version' from source: facts 49915 1727204306.52201: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 49915 1727204306.52321: variable '__network_wireless_connections_defined' from source: role '' defaults 49915 1727204306.52457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204306.52489: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204306.52525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204306.52568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204306.52607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204306.52639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204306.52716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204306.52719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204306.52743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204306.52762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204306.52807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204306.52840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204306.52880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204306.52924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204306.52950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204306.53152: variable 'network_connections' from source: task vars 49915 1727204306.53155: variable 'interface' from source: play vars 49915 1727204306.53200: variable 'interface' from source: play vars 49915 1727204306.53219: variable 'vlan_interface' from source: play vars 49915 1727204306.53291: variable 'vlan_interface' from source: play vars 49915 1727204306.53304: variable 'interface' from source: play vars 49915 
1727204306.53366: variable 'interface' from source: play vars 49915 1727204306.53444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204306.53785: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204306.53789: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204306.53791: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204306.53793: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204306.53796: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204306.53823: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204306.53854: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204306.53887: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204306.53955: variable '__network_team_connections_defined' from source: role '' defaults 49915 1727204306.54204: variable 'network_connections' from source: task vars 49915 1727204306.54215: variable 'interface' from source: play vars 49915 1727204306.54284: variable 'interface' from source: play vars 49915 1727204306.54299: variable 'vlan_interface' from source: play vars 49915 1727204306.54365: variable 'vlan_interface' from source: play vars 49915 1727204306.54378: variable 'interface' from source: play vars 49915 1727204306.54438: variable 'interface' from source: play vars 49915 1727204306.54484: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 49915 1727204306.54492: when evaluation is False, skipping this task 49915 1727204306.54500: _execute() done 49915 1727204306.54507: dumping result to json 49915 1727204306.54515: done dumping result, returning 49915 1727204306.54527: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-dcd7-b5af-00000000001b] 49915 1727204306.54570: sending task result for task 028d2410-947f-dcd7-b5af-00000000001b skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 49915 1727204306.54729: no more pending results, returning what we have 49915 1727204306.54733: results queue empty 49915 1727204306.54734: checking for any_errors_fatal 49915 1727204306.54741: done checking for any_errors_fatal 49915 1727204306.54742: checking for max_fail_percentage 49915 1727204306.54744: done checking for max_fail_percentage 49915 1727204306.54745: checking to see if all hosts have failed and the running result is not ok 49915 1727204306.54746: done checking to see if all hosts 
have failed 49915 1727204306.54747: getting the remaining hosts for this loop 49915 1727204306.54749: done getting the remaining hosts for this loop 49915 1727204306.54754: getting the next task for host managed-node2 49915 1727204306.54761: done getting next task for host managed-node2 49915 1727204306.54765: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 49915 1727204306.54768: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204306.54784: getting variables 49915 1727204306.54786: in VariableManager get_vars() 49915 1727204306.54835: Calling all_inventory to load vars for managed-node2 49915 1727204306.54838: Calling groups_inventory to load vars for managed-node2 49915 1727204306.54840: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204306.54851: Calling all_plugins_play to load vars for managed-node2 49915 1727204306.54854: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204306.54857: Calling groups_plugins_play to load vars for managed-node2 49915 1727204306.55489: done sending task result for task 028d2410-947f-dcd7-b5af-00000000001b 49915 1727204306.55492: WORKER PROCESS EXITING 49915 1727204306.56477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204306.58015: done with get_vars() 49915 1727204306.58043: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 49915 1727204306.58122: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:58:26 -0400 (0:00:00.103) 0:00:13.287 ***** 49915 1727204306.58154: entering _queue_task() for managed-node2/yum 49915 1727204306.58156: Creating lock for yum 49915 1727204306.58492: worker is 1 (out of 1 available) 49915 1727204306.58504: exiting _queue_task() for managed-node2/yum 49915 1727204306.58517: done queuing things up, now waiting for results queue to drain 49915 1727204306.58518: waiting for pending results... 
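The DNF-based check that finished above (task 028d2410-947f-dcd7-b5af-00000000001b) was skipped because its recorded false condition, __network_wireless_connections_defined or __network_team_connections_defined, evaluated to False for this host. As a rough orientation, a task of that shape could look like the sketch below; only the task name and the when expression come from the log, while the module choice, the use of network_packages, check_mode and the register name are assumptions.

# Hypothetical reconstruction -- not the actual role content; module choice,
# check_mode usage and the register name are guesses.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"
    state: latest
  check_mode: true
  register: __network_packages_update_dnf   # assumed variable name
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined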
49915 1727204306.58791: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 49915 1727204306.58927: in run() - task 028d2410-947f-dcd7-b5af-00000000001c 49915 1727204306.58947: variable 'ansible_search_path' from source: unknown 49915 1727204306.58955: variable 'ansible_search_path' from source: unknown 49915 1727204306.58996: calling self._execute() 49915 1727204306.59087: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204306.59100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204306.59117: variable 'omit' from source: magic vars 49915 1727204306.59489: variable 'ansible_distribution_major_version' from source: facts 49915 1727204306.59506: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204306.59684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204306.62687: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204306.62872: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204306.63031: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204306.63082: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204306.63163: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204306.63318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204306.63467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204306.63507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204306.63806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204306.63812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204306.63928: variable 'ansible_distribution_major_version' from source: facts 49915 1727204306.63949: Evaluated conditional (ansible_distribution_major_version | int < 8): False 49915 1727204306.64136: when evaluation is False, skipping this task 49915 1727204306.64139: _execute() done 49915 1727204306.64142: dumping result to json 49915 1727204306.64144: done dumping result, returning 49915 1727204306.64146: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-dcd7-b5af-00000000001c] 49915 
1727204306.64149: sending task result for task 028d2410-947f-dcd7-b5af-00000000001c 49915 1727204306.64226: done sending task result for task 028d2410-947f-dcd7-b5af-00000000001c 49915 1727204306.64230: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 49915 1727204306.64290: no more pending results, returning what we have 49915 1727204306.64294: results queue empty 49915 1727204306.64295: checking for any_errors_fatal 49915 1727204306.64300: done checking for any_errors_fatal 49915 1727204306.64301: checking for max_fail_percentage 49915 1727204306.64303: done checking for max_fail_percentage 49915 1727204306.64304: checking to see if all hosts have failed and the running result is not ok 49915 1727204306.64305: done checking to see if all hosts have failed 49915 1727204306.64306: getting the remaining hosts for this loop 49915 1727204306.64307: done getting the remaining hosts for this loop 49915 1727204306.64311: getting the next task for host managed-node2 49915 1727204306.64318: done getting next task for host managed-node2 49915 1727204306.64322: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 49915 1727204306.64325: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204306.64340: getting variables 49915 1727204306.64342: in VariableManager get_vars() 49915 1727204306.64387: Calling all_inventory to load vars for managed-node2 49915 1727204306.64391: Calling groups_inventory to load vars for managed-node2 49915 1727204306.64393: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204306.64404: Calling all_plugins_play to load vars for managed-node2 49915 1727204306.64406: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204306.64409: Calling groups_plugins_play to load vars for managed-node2 49915 1727204306.68249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204306.72221: done with get_vars() 49915 1727204306.72252: done getting variables 49915 1727204306.72318: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:58:26 -0400 (0:00:00.141) 0:00:13.429 ***** 49915 1727204306.72352: entering _queue_task() for managed-node2/fail 49915 1727204306.72686: worker is 1 (out of 1 available) 49915 1727204306.72697: exiting _queue_task() for managed-node2/fail 49915 1727204306.72712: done queuing things up, now waiting for results queue to drain 49915 1727204306.72713: waiting for pending results... 
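The YUM variant at roles/network/tasks/main.yml:48 was queued through the yum action (redirected to ansible.builtin.dnf) and then skipped, since ansible_distribution_major_version | int < 8 is False on this node. A minimal sketch of such a task follows; the task name, module and the first when expression are taken from the log, the parameters and the second condition (mirroring the DNF variant) are assumptions.

# Hypothetical sketch of the task at roles/network/tasks/main.yml:48; parameters
# and the second when entry are assumed, not confirmed by the log.
- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined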
49915 1727204306.73056: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 49915 1727204306.73244: in run() - task 028d2410-947f-dcd7-b5af-00000000001d 49915 1727204306.73274: variable 'ansible_search_path' from source: unknown 49915 1727204306.73284: variable 'ansible_search_path' from source: unknown 49915 1727204306.73337: calling self._execute() 49915 1727204306.73430: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204306.73441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204306.73681: variable 'omit' from source: magic vars 49915 1727204306.73815: variable 'ansible_distribution_major_version' from source: facts 49915 1727204306.73831: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204306.73967: variable '__network_wireless_connections_defined' from source: role '' defaults 49915 1727204306.74187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204306.77574: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204306.77768: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204306.77957: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204306.78048: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204306.78084: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204306.78238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204306.78367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204306.78643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204306.78721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204306.78725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204306.78768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204306.78800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204306.78903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204306.79156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204306.79159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204306.79161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204306.79163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204306.79290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204306.79338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204306.79395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204306.79764: variable 'network_connections' from source: task vars 49915 1727204306.79830: variable 'interface' from source: play vars 49915 1727204306.80029: variable 'interface' from source: play vars 49915 1727204306.80048: variable 'vlan_interface' from source: play vars 49915 1727204306.80356: variable 'vlan_interface' from source: play vars 49915 1727204306.80359: variable 'interface' from source: play vars 49915 1727204306.80362: variable 'interface' from source: play vars 49915 1727204306.80511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204306.80909: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204306.80951: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204306.81005: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204306.81113: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204306.81231: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204306.81261: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204306.81483: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204306.81486: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204306.81488: variable '__network_team_connections_defined' from source: role '' defaults 49915 1727204306.82082: variable 'network_connections' from source: task vars 49915 1727204306.82372: variable 'interface' from source: play vars 49915 1727204306.82445: variable 'interface' from source: play vars 49915 1727204306.82455: variable 'vlan_interface' from source: play vars 49915 1727204306.82626: variable 'vlan_interface' from source: play vars 49915 1727204306.82632: variable 'interface' from source: play vars 49915 1727204306.82844: variable 'interface' from source: play vars 49915 1727204306.82880: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 49915 1727204306.82890: when evaluation is False, skipping this task 49915 1727204306.82892: _execute() done 49915 1727204306.82895: dumping result to json 49915 1727204306.82898: done dumping result, returning 49915 1727204306.82901: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-dcd7-b5af-00000000001d] 49915 1727204306.82903: sending task result for task 028d2410-947f-dcd7-b5af-00000000001d skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 49915 1727204306.83243: no more pending results, returning what we have 49915 1727204306.83246: results queue empty 49915 1727204306.83247: checking for any_errors_fatal 49915 1727204306.83251: done checking for any_errors_fatal 49915 1727204306.83252: checking for max_fail_percentage 49915 1727204306.83254: done checking for max_fail_percentage 49915 1727204306.83255: checking to see if all hosts have failed and the running result is not ok 49915 1727204306.83256: done checking to see if all hosts have failed 49915 1727204306.83256: getting the remaining hosts for this loop 49915 1727204306.83258: done getting the remaining hosts for this loop 49915 1727204306.83261: getting the next task for host managed-node2 49915 1727204306.83267: done getting next task for host managed-node2 49915 1727204306.83273: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 49915 1727204306.83277: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204306.83290: getting variables 49915 1727204306.83291: in VariableManager get_vars() 49915 1727204306.83334: Calling all_inventory to load vars for managed-node2 49915 1727204306.83336: Calling groups_inventory to load vars for managed-node2 49915 1727204306.83338: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204306.83349: Calling all_plugins_play to load vars for managed-node2 49915 1727204306.83351: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204306.83354: Calling groups_plugins_play to load vars for managed-node2 49915 1727204306.84083: done sending task result for task 028d2410-947f-dcd7-b5af-00000000001d 49915 1727204306.84087: WORKER PROCESS EXITING 49915 1727204306.86483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204306.90603: done with get_vars() 49915 1727204306.90631: done getting variables 49915 1727204306.90699: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:58:26 -0400 (0:00:00.183) 0:00:13.613 ***** 49915 1727204306.90738: entering _queue_task() for managed-node2/package 49915 1727204306.91804: worker is 1 (out of 1 available) 49915 1727204306.91821: exiting _queue_task() for managed-node2/package 49915 1727204306.91833: done queuing things up, now waiting for results queue to drain 49915 1727204306.91834: waiting for pending results... 
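The consent task at roles/network/tasks/main.yml:60 loads the fail action and is skipped on the same wireless/team condition. A sketch is given below for context; the fail module and the when expression are confirmed by the log, while the message text (and any consent flag the real task may additionally check) is a placeholder.

# Hypothetical sketch of the task at roles/network/tasks/main.yml:60.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    # placeholder wording -- the real role's message is not visible in this log
    msg: NetworkManager would need to be restarted to apply wireless or team profiles.
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined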
49915 1727204306.92420: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 49915 1727204306.92659: in run() - task 028d2410-947f-dcd7-b5af-00000000001e 49915 1727204306.92697: variable 'ansible_search_path' from source: unknown 49915 1727204306.92759: variable 'ansible_search_path' from source: unknown 49915 1727204306.92859: calling self._execute() 49915 1727204306.92998: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204306.93010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204306.93028: variable 'omit' from source: magic vars 49915 1727204306.93440: variable 'ansible_distribution_major_version' from source: facts 49915 1727204306.93458: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204306.93672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204306.94039: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204306.94043: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204306.94071: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204306.94110: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204306.94231: variable 'network_packages' from source: role '' defaults 49915 1727204306.94349: variable '__network_provider_setup' from source: role '' defaults 49915 1727204306.94372: variable '__network_service_name_default_nm' from source: role '' defaults 49915 1727204306.94447: variable '__network_service_name_default_nm' from source: role '' defaults 49915 1727204306.94463: variable '__network_packages_default_nm' from source: role '' defaults 49915 1727204306.94536: variable '__network_packages_default_nm' from source: role '' defaults 49915 1727204306.94744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204307.04161: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204307.04355: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204307.04585: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204307.04589: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204307.04591: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204307.04711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204307.04754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204307.04850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204307.04971: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204307.05130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204307.05153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204307.05348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204307.05351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204307.05354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204307.05564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204307.05928: variable '__network_packages_default_gobject_packages' from source: role '' defaults 49915 1727204307.06238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204307.06284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204307.06370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204307.06554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204307.06557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204307.06828: variable 'ansible_python' from source: facts 49915 1727204307.06920: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 49915 1727204307.07169: variable '__network_wpa_supplicant_required' from source: role '' defaults 49915 1727204307.07349: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 49915 1727204307.07509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204307.07540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 49915 1727204307.07569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204307.07624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204307.07644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204307.07695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204307.07746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204307.07777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204307.07832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204307.07852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204307.08037: variable 'network_connections' from source: task vars 49915 1727204307.08042: variable 'interface' from source: play vars 49915 1727204307.08181: variable 'interface' from source: play vars 49915 1727204307.08185: variable 'vlan_interface' from source: play vars 49915 1727204307.08279: variable 'vlan_interface' from source: play vars 49915 1727204307.08293: variable 'interface' from source: play vars 49915 1727204307.08403: variable 'interface' from source: play vars 49915 1727204307.08495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204307.08585: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204307.08588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204307.08605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204307.08650: variable '__network_wireless_connections_defined' from source: role '' defaults 49915 1727204307.08981: variable 'network_connections' from source: task vars 49915 1727204307.08991: variable 'interface' from source: play vars 49915 1727204307.09108: variable 'interface' from source: play vars 49915 1727204307.09135: variable 
'vlan_interface' from source: play vars 49915 1727204307.09244: variable 'vlan_interface' from source: play vars 49915 1727204307.09258: variable 'interface' from source: play vars 49915 1727204307.09455: variable 'interface' from source: play vars 49915 1727204307.09458: variable '__network_packages_default_wireless' from source: role '' defaults 49915 1727204307.09532: variable '__network_wireless_connections_defined' from source: role '' defaults 49915 1727204307.09892: variable 'network_connections' from source: task vars 49915 1727204307.09908: variable 'interface' from source: play vars 49915 1727204307.09974: variable 'interface' from source: play vars 49915 1727204307.09999: variable 'vlan_interface' from source: play vars 49915 1727204307.10069: variable 'vlan_interface' from source: play vars 49915 1727204307.10092: variable 'interface' from source: play vars 49915 1727204307.10170: variable 'interface' from source: play vars 49915 1727204307.10225: variable '__network_packages_default_team' from source: role '' defaults 49915 1727204307.10317: variable '__network_team_connections_defined' from source: role '' defaults 49915 1727204307.10725: variable 'network_connections' from source: task vars 49915 1727204307.10728: variable 'interface' from source: play vars 49915 1727204307.10985: variable 'interface' from source: play vars 49915 1727204307.10989: variable 'vlan_interface' from source: play vars 49915 1727204307.11069: variable 'vlan_interface' from source: play vars 49915 1727204307.11287: variable 'interface' from source: play vars 49915 1727204307.11290: variable 'interface' from source: play vars 49915 1727204307.11428: variable '__network_service_name_default_initscripts' from source: role '' defaults 49915 1727204307.11784: variable '__network_service_name_default_initscripts' from source: role '' defaults 49915 1727204307.11788: variable '__network_packages_default_initscripts' from source: role '' defaults 49915 1727204307.11791: variable '__network_packages_default_initscripts' from source: role '' defaults 49915 1727204307.12148: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 49915 1727204307.12960: variable 'network_connections' from source: task vars 49915 1727204307.12970: variable 'interface' from source: play vars 49915 1727204307.13041: variable 'interface' from source: play vars 49915 1727204307.13054: variable 'vlan_interface' from source: play vars 49915 1727204307.13125: variable 'vlan_interface' from source: play vars 49915 1727204307.13137: variable 'interface' from source: play vars 49915 1727204307.13207: variable 'interface' from source: play vars 49915 1727204307.13226: variable 'ansible_distribution' from source: facts 49915 1727204307.13236: variable '__network_rh_distros' from source: role '' defaults 49915 1727204307.13246: variable 'ansible_distribution_major_version' from source: facts 49915 1727204307.13280: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 49915 1727204307.13484: variable 'ansible_distribution' from source: facts 49915 1727204307.13493: variable '__network_rh_distros' from source: role '' defaults 49915 1727204307.13502: variable 'ansible_distribution_major_version' from source: facts 49915 1727204307.13529: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 49915 1727204307.13704: variable 'ansible_distribution' from source: facts 49915 1727204307.13718: variable '__network_rh_distros' from source: 
role '' defaults 49915 1727204307.13729: variable 'ansible_distribution_major_version' from source: facts 49915 1727204307.13778: variable 'network_provider' from source: set_fact 49915 1727204307.13799: variable 'ansible_facts' from source: unknown 49915 1727204307.14481: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 49915 1727204307.14490: when evaluation is False, skipping this task 49915 1727204307.14497: _execute() done 49915 1727204307.14510: dumping result to json 49915 1727204307.14521: done dumping result, returning 49915 1727204307.14533: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-dcd7-b5af-00000000001e] 49915 1727204307.14541: sending task result for task 028d2410-947f-dcd7-b5af-00000000001e skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 49915 1727204307.14730: no more pending results, returning what we have 49915 1727204307.14733: results queue empty 49915 1727204307.14734: checking for any_errors_fatal 49915 1727204307.14741: done checking for any_errors_fatal 49915 1727204307.14741: checking for max_fail_percentage 49915 1727204307.14743: done checking for max_fail_percentage 49915 1727204307.14744: checking to see if all hosts have failed and the running result is not ok 49915 1727204307.14745: done checking to see if all hosts have failed 49915 1727204307.14746: getting the remaining hosts for this loop 49915 1727204307.14747: done getting the remaining hosts for this loop 49915 1727204307.14752: getting the next task for host managed-node2 49915 1727204307.14758: done getting next task for host managed-node2 49915 1727204307.14762: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 49915 1727204307.14769: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204307.14784: getting variables 49915 1727204307.14790: in VariableManager get_vars() 49915 1727204307.14838: Calling all_inventory to load vars for managed-node2 49915 1727204307.14842: Calling groups_inventory to load vars for managed-node2 49915 1727204307.14844: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204307.14854: Calling all_plugins_play to load vars for managed-node2 49915 1727204307.14857: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204307.14860: Calling groups_plugins_play to load vars for managed-node2 49915 1727204307.15891: done sending task result for task 028d2410-947f-dcd7-b5af-00000000001e 49915 1727204307.15895: WORKER PROCESS EXITING 49915 1727204307.23378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204307.25735: done with get_vars() 49915 1727204307.25763: done getting variables 49915 1727204307.25964: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:58:27 -0400 (0:00:00.352) 0:00:13.966 ***** 49915 1727204307.25997: entering _queue_task() for managed-node2/package 49915 1727204307.26752: worker is 1 (out of 1 available) 49915 1727204307.26766: exiting _queue_task() for managed-node2/package 49915 1727204307.26782: done queuing things up, now waiting for results queue to drain 49915 1727204307.26785: waiting for pending results... 
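The Install packages task at roles/network/tasks/main.yml:73 uses the package action and is skipped because not network_packages is subset(ansible_facts.packages.keys()) is False, i.e. every package in network_packages is already present in the gathered package facts. A minimal sketch under those observations, with state chosen by assumption:

# Hypothetical sketch of the task at roles/network/tasks/main.yml:73; module,
# variable name and when expression come from the log, state is assumed.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when:
    - not network_packages is subset(ansible_facts.packages.keys())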
49915 1727204307.27288: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 49915 1727204307.27539: in run() - task 028d2410-947f-dcd7-b5af-00000000001f 49915 1727204307.27553: variable 'ansible_search_path' from source: unknown 49915 1727204307.27557: variable 'ansible_search_path' from source: unknown 49915 1727204307.27594: calling self._execute() 49915 1727204307.27851: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204307.27858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204307.27869: variable 'omit' from source: magic vars 49915 1727204307.29011: variable 'ansible_distribution_major_version' from source: facts 49915 1727204307.29023: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204307.29315: variable 'network_state' from source: role '' defaults 49915 1727204307.29393: Evaluated conditional (network_state != {}): False 49915 1727204307.29398: when evaluation is False, skipping this task 49915 1727204307.29401: _execute() done 49915 1727204307.29404: dumping result to json 49915 1727204307.29409: done dumping result, returning 49915 1727204307.29566: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-dcd7-b5af-00000000001f] 49915 1727204307.29572: sending task result for task 028d2410-947f-dcd7-b5af-00000000001f 49915 1727204307.29866: done sending task result for task 028d2410-947f-dcd7-b5af-00000000001f 49915 1727204307.29871: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49915 1727204307.29921: no more pending results, returning what we have 49915 1727204307.29925: results queue empty 49915 1727204307.29926: checking for any_errors_fatal 49915 1727204307.29932: done checking for any_errors_fatal 49915 1727204307.29933: checking for max_fail_percentage 49915 1727204307.29934: done checking for max_fail_percentage 49915 1727204307.29935: checking to see if all hosts have failed and the running result is not ok 49915 1727204307.29936: done checking to see if all hosts have failed 49915 1727204307.29937: getting the remaining hosts for this loop 49915 1727204307.29938: done getting the remaining hosts for this loop 49915 1727204307.29942: getting the next task for host managed-node2 49915 1727204307.29949: done getting next task for host managed-node2 49915 1727204307.29953: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 49915 1727204307.29959: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204307.30087: getting variables 49915 1727204307.30089: in VariableManager get_vars() 49915 1727204307.30137: Calling all_inventory to load vars for managed-node2 49915 1727204307.30142: Calling groups_inventory to load vars for managed-node2 49915 1727204307.30144: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204307.30154: Calling all_plugins_play to load vars for managed-node2 49915 1727204307.30157: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204307.30159: Calling groups_plugins_play to load vars for managed-node2 49915 1727204307.33462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204307.37546: done with get_vars() 49915 1727204307.37788: done getting variables 49915 1727204307.37857: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:58:27 -0400 (0:00:00.118) 0:00:14.085 ***** 49915 1727204307.37897: entering _queue_task() for managed-node2/package 49915 1727204307.39299: worker is 1 (out of 1 available) 49915 1727204307.39310: exiting _queue_task() for managed-node2/package 49915 1727204307.39321: done queuing things up, now waiting for results queue to drain 49915 1727204307.39322: waiting for pending results... 
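The task at roles/network/tasks/main.yml:85 is likewise a package action and is skipped because network_state != {} is False (no network_state was supplied). A sketch for orientation; the package list is inferred from the task title rather than shown in the log.

# Hypothetical sketch of the task at roles/network/tasks/main.yml:85.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}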
49915 1727204307.39905: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 49915 1727204307.40454: in run() - task 028d2410-947f-dcd7-b5af-000000000020 49915 1727204307.40458: variable 'ansible_search_path' from source: unknown 49915 1727204307.40462: variable 'ansible_search_path' from source: unknown 49915 1727204307.40490: calling self._execute() 49915 1727204307.41007: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204307.41011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204307.41014: variable 'omit' from source: magic vars 49915 1727204307.42244: variable 'ansible_distribution_major_version' from source: facts 49915 1727204307.42255: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204307.42678: variable 'network_state' from source: role '' defaults 49915 1727204307.42991: Evaluated conditional (network_state != {}): False 49915 1727204307.42994: when evaluation is False, skipping this task 49915 1727204307.42996: _execute() done 49915 1727204307.42998: dumping result to json 49915 1727204307.43000: done dumping result, returning 49915 1727204307.43002: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-dcd7-b5af-000000000020] 49915 1727204307.43004: sending task result for task 028d2410-947f-dcd7-b5af-000000000020 49915 1727204307.43102: done sending task result for task 028d2410-947f-dcd7-b5af-000000000020 49915 1727204307.43108: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49915 1727204307.43316: no more pending results, returning what we have 49915 1727204307.43322: results queue empty 49915 1727204307.43323: checking for any_errors_fatal 49915 1727204307.43334: done checking for any_errors_fatal 49915 1727204307.43335: checking for max_fail_percentage 49915 1727204307.43337: done checking for max_fail_percentage 49915 1727204307.43338: checking to see if all hosts have failed and the running result is not ok 49915 1727204307.43339: done checking to see if all hosts have failed 49915 1727204307.43340: getting the remaining hosts for this loop 49915 1727204307.43342: done getting the remaining hosts for this loop 49915 1727204307.43346: getting the next task for host managed-node2 49915 1727204307.43358: done getting next task for host managed-node2 49915 1727204307.43362: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 49915 1727204307.43368: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204307.43390: getting variables 49915 1727204307.43392: in VariableManager get_vars() 49915 1727204307.43442: Calling all_inventory to load vars for managed-node2 49915 1727204307.43445: Calling groups_inventory to load vars for managed-node2 49915 1727204307.43448: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204307.43460: Calling all_plugins_play to load vars for managed-node2 49915 1727204307.43464: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204307.43467: Calling groups_plugins_play to load vars for managed-node2 49915 1727204307.46562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204307.50413: done with get_vars() 49915 1727204307.50444: done getting variables 49915 1727204307.50663: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:58:27 -0400 (0:00:00.128) 0:00:14.213 ***** 49915 1727204307.50698: entering _queue_task() for managed-node2/service 49915 1727204307.50700: Creating lock for service 49915 1727204307.51534: worker is 1 (out of 1 available) 49915 1727204307.51546: exiting _queue_task() for managed-node2/service 49915 1727204307.51558: done queuing things up, now waiting for results queue to drain 49915 1727204307.51560: waiting for pending results... 
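The companion task at roles/network/tasks/main.yml:96 is skipped on the same network_state != {} condition. Sketched below under the same caveats, with the package name inferred from the task title.

# Hypothetical sketch of the task at roles/network/tasks/main.yml:96.
- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when: network_state != {}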
49915 1727204307.51905: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 49915 1727204307.52181: in run() - task 028d2410-947f-dcd7-b5af-000000000021 49915 1727204307.52184: variable 'ansible_search_path' from source: unknown 49915 1727204307.52187: variable 'ansible_search_path' from source: unknown 49915 1727204307.52189: calling self._execute() 49915 1727204307.52192: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204307.52194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204307.52207: variable 'omit' from source: magic vars 49915 1727204307.52606: variable 'ansible_distribution_major_version' from source: facts 49915 1727204307.52627: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204307.52767: variable '__network_wireless_connections_defined' from source: role '' defaults 49915 1727204307.52991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204307.56218: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204307.56296: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204307.56355: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204307.56400: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204307.56432: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204307.56516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204307.56551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204307.56585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204307.56633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204307.56653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204307.56712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204307.56799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204307.56807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 49915 1727204307.56880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204307.56894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204307.57016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204307.57020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204307.57023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204307.57047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204307.57066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204307.57250: variable 'network_connections' from source: task vars 49915 1727204307.57269: variable 'interface' from source: play vars 49915 1727204307.57352: variable 'interface' from source: play vars 49915 1727204307.57371: variable 'vlan_interface' from source: play vars 49915 1727204307.57437: variable 'vlan_interface' from source: play vars 49915 1727204307.57455: variable 'interface' from source: play vars 49915 1727204307.57520: variable 'interface' from source: play vars 49915 1727204307.57603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204307.57793: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204307.57835: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204307.57886: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204307.57922: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204307.58037: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204307.58063: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204307.58281: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204307.58285: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204307.58536: variable '__network_team_connections_defined' from source: role '' defaults 49915 1727204307.58853: variable 'network_connections' from source: task vars 49915 1727204307.58980: variable 'interface' from source: play vars 49915 1727204307.59051: variable 'interface' from source: play vars 49915 1727204307.59066: variable 'vlan_interface' from source: play vars 49915 1727204307.59138: variable 'vlan_interface' from source: play vars 49915 1727204307.59229: variable 'interface' from source: play vars 49915 1727204307.59301: variable 'interface' from source: play vars 49915 1727204307.59354: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 49915 1727204307.59384: when evaluation is False, skipping this task 49915 1727204307.59392: _execute() done 49915 1727204307.59399: dumping result to json 49915 1727204307.59407: done dumping result, returning 49915 1727204307.59419: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-dcd7-b5af-000000000021] 49915 1727204307.59429: sending task result for task 028d2410-947f-dcd7-b5af-000000000021 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 49915 1727204307.59583: no more pending results, returning what we have 49915 1727204307.59591: results queue empty 49915 1727204307.59593: checking for any_errors_fatal 49915 1727204307.59600: done checking for any_errors_fatal 49915 1727204307.59601: checking for max_fail_percentage 49915 1727204307.59603: done checking for max_fail_percentage 49915 1727204307.59604: checking to see if all hosts have failed and the running result is not ok 49915 1727204307.59605: done checking to see if all hosts have failed 49915 1727204307.59606: getting the remaining hosts for this loop 49915 1727204307.59608: done getting the remaining hosts for this loop 49915 1727204307.59612: getting the next task for host managed-node2 49915 1727204307.59619: done getting next task for host managed-node2 49915 1727204307.59623: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 49915 1727204307.59626: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204307.59642: getting variables 49915 1727204307.59644: in VariableManager get_vars() 49915 1727204307.59690: Calling all_inventory to load vars for managed-node2 49915 1727204307.59693: Calling groups_inventory to load vars for managed-node2 49915 1727204307.59695: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204307.59706: Calling all_plugins_play to load vars for managed-node2 49915 1727204307.59710: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204307.59712: Calling groups_plugins_play to load vars for managed-node2 49915 1727204307.60413: done sending task result for task 028d2410-947f-dcd7-b5af-000000000021 49915 1727204307.60418: WORKER PROCESS EXITING 49915 1727204307.61586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204307.63627: done with get_vars() 49915 1727204307.63656: done getting variables 49915 1727204307.63719: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:58:27 -0400 (0:00:00.130) 0:00:14.343 ***** 49915 1727204307.63758: entering _queue_task() for managed-node2/service 49915 1727204307.64440: worker is 1 (out of 1 available) 49915 1727204307.64454: exiting _queue_task() for managed-node2/service 49915 1727204307.64467: done queuing things up, now waiting for results queue to drain 49915 1727204307.64469: waiting for pending results... 
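The skip recorded above comes from the role conditional visible in the trace: the "Restart NetworkManager due to wireless or team interfaces" task only runs when __network_wireless_connections_defined or __network_team_connections_defined evaluates true, and here both are false, so the task is skipped with "Conditional result was False". A minimal, hypothetical sketch of such a conditional restart task follows (this is not the role's actual task file under roles/network/tasks/, only an illustration of the logged condition; the module and restarted state are assumptions):

  # Hypothetical illustration of the conditional logged above;
  # not the actual task from roles/network/tasks/main.yml.
  - name: Restart NetworkManager due to wireless or team interfaces
    ansible.builtin.service:
      name: NetworkManager
      state: restarted
    when: __network_wireless_connections_defined or __network_team_connections_defined

With neither wireless nor team connections defined in network_connections, the when clause is false and Ansible reports skipping: [managed-node2], exactly as shown above.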
49915 1727204307.64878: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 49915 1727204307.65054: in run() - task 028d2410-947f-dcd7-b5af-000000000022 49915 1727204307.65078: variable 'ansible_search_path' from source: unknown 49915 1727204307.65088: variable 'ansible_search_path' from source: unknown 49915 1727204307.65136: calling self._execute() 49915 1727204307.65230: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204307.65242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204307.65254: variable 'omit' from source: magic vars 49915 1727204307.65638: variable 'ansible_distribution_major_version' from source: facts 49915 1727204307.65655: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204307.65825: variable 'network_provider' from source: set_fact 49915 1727204307.65835: variable 'network_state' from source: role '' defaults 49915 1727204307.65850: Evaluated conditional (network_provider == "nm" or network_state != {}): True 49915 1727204307.65860: variable 'omit' from source: magic vars 49915 1727204307.65924: variable 'omit' from source: magic vars 49915 1727204307.65957: variable 'network_service_name' from source: role '' defaults 49915 1727204307.66035: variable 'network_service_name' from source: role '' defaults 49915 1727204307.66148: variable '__network_provider_setup' from source: role '' defaults 49915 1727204307.66160: variable '__network_service_name_default_nm' from source: role '' defaults 49915 1727204307.66231: variable '__network_service_name_default_nm' from source: role '' defaults 49915 1727204307.66246: variable '__network_packages_default_nm' from source: role '' defaults 49915 1727204307.66316: variable '__network_packages_default_nm' from source: role '' defaults 49915 1727204307.66553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204307.69093: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204307.69180: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204307.69253: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204307.69263: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204307.69296: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204307.69382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204307.69420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204307.69470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204307.69498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 49915 1727204307.69521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204307.69580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204307.69604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204307.69683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204307.69686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204307.69700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204307.69953: variable '__network_packages_default_gobject_packages' from source: role '' defaults 49915 1727204307.70088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204307.70123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204307.70152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204307.70197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204307.70218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204307.70334: variable 'ansible_python' from source: facts 49915 1727204307.70353: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 49915 1727204307.70549: variable '__network_wpa_supplicant_required' from source: role '' defaults 49915 1727204307.70553: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 49915 1727204307.70670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204307.70701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204307.70734: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204307.70784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204307.70803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204307.70855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204307.70897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204307.70929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204307.70972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204307.71000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204307.71148: variable 'network_connections' from source: task vars 49915 1727204307.71161: variable 'interface' from source: play vars 49915 1727204307.71244: variable 'interface' from source: play vars 49915 1727204307.71281: variable 'vlan_interface' from source: play vars 49915 1727204307.71348: variable 'vlan_interface' from source: play vars 49915 1727204307.71363: variable 'interface' from source: play vars 49915 1727204307.71531: variable 'interface' from source: play vars 49915 1727204307.71566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204307.71788: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204307.71844: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204307.71899: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204307.71948: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204307.72027: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204307.72062: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204307.72105: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 
1727204307.72146: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204307.72203: variable '__network_wireless_connections_defined' from source: role '' defaults 49915 1727204307.72501: variable 'network_connections' from source: task vars 49915 1727204307.72580: variable 'interface' from source: play vars 49915 1727204307.72593: variable 'interface' from source: play vars 49915 1727204307.72614: variable 'vlan_interface' from source: play vars 49915 1727204307.72693: variable 'vlan_interface' from source: play vars 49915 1727204307.72710: variable 'interface' from source: play vars 49915 1727204307.72792: variable 'interface' from source: play vars 49915 1727204307.72854: variable '__network_packages_default_wireless' from source: role '' defaults 49915 1727204307.72950: variable '__network_wireless_connections_defined' from source: role '' defaults 49915 1727204307.73271: variable 'network_connections' from source: task vars 49915 1727204307.73274: variable 'interface' from source: play vars 49915 1727204307.73382: variable 'interface' from source: play vars 49915 1727204307.73385: variable 'vlan_interface' from source: play vars 49915 1727204307.73435: variable 'vlan_interface' from source: play vars 49915 1727204307.73447: variable 'interface' from source: play vars 49915 1727204307.73527: variable 'interface' from source: play vars 49915 1727204307.73558: variable '__network_packages_default_team' from source: role '' defaults 49915 1727204307.73647: variable '__network_team_connections_defined' from source: role '' defaults 49915 1727204307.74081: variable 'network_connections' from source: task vars 49915 1727204307.74084: variable 'interface' from source: play vars 49915 1727204307.74086: variable 'interface' from source: play vars 49915 1727204307.74088: variable 'vlan_interface' from source: play vars 49915 1727204307.74128: variable 'vlan_interface' from source: play vars 49915 1727204307.74140: variable 'interface' from source: play vars 49915 1727204307.74219: variable 'interface' from source: play vars 49915 1727204307.74291: variable '__network_service_name_default_initscripts' from source: role '' defaults 49915 1727204307.74360: variable '__network_service_name_default_initscripts' from source: role '' defaults 49915 1727204307.74372: variable '__network_packages_default_initscripts' from source: role '' defaults 49915 1727204307.74442: variable '__network_packages_default_initscripts' from source: role '' defaults 49915 1727204307.74678: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 49915 1727204307.75225: variable 'network_connections' from source: task vars 49915 1727204307.75236: variable 'interface' from source: play vars 49915 1727204307.75305: variable 'interface' from source: play vars 49915 1727204307.75323: variable 'vlan_interface' from source: play vars 49915 1727204307.75387: variable 'vlan_interface' from source: play vars 49915 1727204307.75403: variable 'interface' from source: play vars 49915 1727204307.75467: variable 'interface' from source: play vars 49915 1727204307.75509: variable 'ansible_distribution' from source: facts 49915 1727204307.75515: variable '__network_rh_distros' from source: role '' defaults 49915 1727204307.75518: variable 'ansible_distribution_major_version' from source: facts 49915 1727204307.75539: variable 
'__network_packages_default_initscripts_network_scripts' from source: role '' defaults 49915 1727204307.75728: variable 'ansible_distribution' from source: facts 49915 1727204307.75782: variable '__network_rh_distros' from source: role '' defaults 49915 1727204307.75785: variable 'ansible_distribution_major_version' from source: facts 49915 1727204307.75788: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 49915 1727204307.75949: variable 'ansible_distribution' from source: facts 49915 1727204307.75959: variable '__network_rh_distros' from source: role '' defaults 49915 1727204307.75968: variable 'ansible_distribution_major_version' from source: facts 49915 1727204307.76011: variable 'network_provider' from source: set_fact 49915 1727204307.76042: variable 'omit' from source: magic vars 49915 1727204307.76081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204307.76163: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204307.76166: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204307.76169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204307.76174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204307.76210: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204307.76222: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204307.76229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204307.76327: Set connection var ansible_connection to ssh 49915 1727204307.76334: Set connection var ansible_shell_type to sh 49915 1727204307.76381: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204307.76384: Set connection var ansible_shell_executable to /bin/sh 49915 1727204307.76386: Set connection var ansible_timeout to 10 49915 1727204307.76388: Set connection var ansible_pipelining to False 49915 1727204307.76408: variable 'ansible_shell_executable' from source: unknown 49915 1727204307.76418: variable 'ansible_connection' from source: unknown 49915 1727204307.76425: variable 'ansible_module_compression' from source: unknown 49915 1727204307.76431: variable 'ansible_shell_type' from source: unknown 49915 1727204307.76436: variable 'ansible_shell_executable' from source: unknown 49915 1727204307.76480: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204307.76483: variable 'ansible_pipelining' from source: unknown 49915 1727204307.76486: variable 'ansible_timeout' from source: unknown 49915 1727204307.76488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204307.76603: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204307.76606: variable 'omit' from source: magic vars 49915 1727204307.76608: starting attempt loop 49915 1727204307.76611: running the handler 49915 1727204307.76677: variable 'ansible_facts' from source: unknown 49915 
1727204307.77498: _low_level_execute_command(): starting 49915 1727204307.77510: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204307.78296: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204307.78355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204307.78372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204307.78397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204307.78510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204307.80209: stdout chunk (state=3): >>>/root <<< 49915 1727204307.80381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204307.80384: stdout chunk (state=3): >>><<< 49915 1727204307.80386: stderr chunk (state=3): >>><<< 49915 1727204307.80494: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204307.80498: _low_level_execute_command(): starting 49915 1727204307.80501: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204307.8040762-51021-195642190112450 `" && echo ansible-tmp-1727204307.8040762-51021-195642190112450="` echo /root/.ansible/tmp/ansible-tmp-1727204307.8040762-51021-195642190112450 `" ) && sleep 0' 49915 1727204307.81131: 
stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204307.81136: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204307.81180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204307.81202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204307.81231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204307.81338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204307.83277: stdout chunk (state=3): >>>ansible-tmp-1727204307.8040762-51021-195642190112450=/root/.ansible/tmp/ansible-tmp-1727204307.8040762-51021-195642190112450 <<< 49915 1727204307.83583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204307.83586: stdout chunk (state=3): >>><<< 49915 1727204307.83589: stderr chunk (state=3): >>><<< 49915 1727204307.83592: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204307.8040762-51021-195642190112450=/root/.ansible/tmp/ansible-tmp-1727204307.8040762-51021-195642190112450 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204307.83598: variable 'ansible_module_compression' from source: unknown 49915 1727204307.83602: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 49915 1727204307.83604: ANSIBALLZ: Acquiring lock 49915 1727204307.83607: ANSIBALLZ: Lock acquired: 140698012046288 49915 1727204307.83609: ANSIBALLZ: Creating module 49915 1727204308.44082: 
ANSIBALLZ: Writing module into payload 49915 1727204308.44491: ANSIBALLZ: Writing module 49915 1727204308.44683: ANSIBALLZ: Renaming module 49915 1727204308.44687: ANSIBALLZ: Done creating module 49915 1727204308.44689: variable 'ansible_facts' from source: unknown 49915 1727204308.45062: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204307.8040762-51021-195642190112450/AnsiballZ_systemd.py 49915 1727204308.45447: Sending initial data 49915 1727204308.45450: Sent initial data (156 bytes) 49915 1727204308.46651: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204308.46666: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204308.46683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204308.46701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204308.46748: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204308.46846: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204308.46880: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204308.46904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204308.46919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204308.47078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204308.48761: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204308.48917: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49915 1727204308.48994: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpxpql6g_s /root/.ansible/tmp/ansible-tmp-1727204307.8040762-51021-195642190112450/AnsiballZ_systemd.py <<< 49915 1727204308.49007: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204307.8040762-51021-195642190112450/AnsiballZ_systemd.py" <<< 49915 1727204308.49067: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpxpql6g_s" to remote "/root/.ansible/tmp/ansible-tmp-1727204307.8040762-51021-195642190112450/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204307.8040762-51021-195642190112450/AnsiballZ_systemd.py" <<< 49915 1727204308.52023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204308.52036: stdout chunk (state=3): >>><<< 49915 1727204308.52285: stderr chunk (state=3): >>><<< 49915 1727204308.52289: done transferring module to remote 49915 1727204308.52291: _low_level_execute_command(): starting 49915 1727204308.52293: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204307.8040762-51021-195642190112450/ /root/.ansible/tmp/ansible-tmp-1727204307.8040762-51021-195642190112450/AnsiballZ_systemd.py && sleep 0' 49915 1727204308.53983: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204308.54091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204308.54110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204308.54128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204308.54268: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204308.54522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204308.54633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204308.56486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204308.56544: stderr chunk (state=3): >>><<< 49915 1727204308.56558: stdout chunk (state=3): >>><<< 49915 1727204308.56595: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204308.56783: _low_level_execute_command(): starting 49915 1727204308.56789: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204307.8040762-51021-195642190112450/AnsiballZ_systemd.py && sleep 0' 49915 1727204308.58085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204308.58341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204308.58479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204308.58502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204308.58626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204308.87907: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "7081", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", 
"OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainStartTimestampMonotonic": "294798591", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainHandoffTimestampMonotonic": "294813549", "ExecMainPID": "7081", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4312", "MemoryCurrent": "4595712", "MemoryPeak": "7655424", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313479680", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1846570000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 49915 1727204308.87945: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", 
"LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", 
"Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "dbus-broker.service cloud-init-local.service network-pre.target basic.target system.slice systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:56:08 EDT", "StateChangeTimestampMonotonic": "755095855", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveExitTimestampMonotonic": "294799297", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveEnterTimestampMonotonic": "294888092", "ActiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveExitTimestampMonotonic": "294768391", "InactiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveEnterTimestampMonotonic": "294795966", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ConditionTimestampMonotonic": "294797207", "AssertTimestamp": "Tue 2024-09-24 14:48:28 EDT", "AssertTimestampMonotonic": "294797210", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a167241d4c7945a58749ffeda353964d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 49915 1727204308.89899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 49915 1727204308.89936: stdout chunk (state=3): >>><<< 49915 1727204308.89998: stderr chunk (state=3): >>><<< 49915 1727204308.90104: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "7081", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainStartTimestampMonotonic": "294798591", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainHandoffTimestampMonotonic": "294813549", "ExecMainPID": "7081", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4312", "MemoryCurrent": "4595712", "MemoryPeak": "7655424", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313479680", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1846570000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "dbus-broker.service cloud-init-local.service network-pre.target basic.target system.slice systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:56:08 EDT", "StateChangeTimestampMonotonic": "755095855", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveExitTimestampMonotonic": "294799297", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveEnterTimestampMonotonic": "294888092", "ActiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveExitTimestampMonotonic": "294768391", "InactiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveEnterTimestampMonotonic": "294795966", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ConditionTimestampMonotonic": "294797207", "AssertTimestamp": "Tue 2024-09-24 14:48:28 EDT", "AssertTimestampMonotonic": "294797210", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a167241d4c7945a58749ffeda353964d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 49915 1727204308.90259: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204307.8040762-51021-195642190112450/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204308.90271: _low_level_execute_command(): starting 49915 1727204308.90284: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204307.8040762-51021-195642190112450/ > /dev/null 2>&1 && sleep 0' 49915 1727204308.91422: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204308.91524: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204308.91547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 
1727204308.91702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204308.93639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204308.93683: stdout chunk (state=3): >>><<< 49915 1727204308.93687: stderr chunk (state=3): >>><<< 49915 1727204308.93881: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204308.93884: handler run complete 49915 1727204308.93886: attempt loop complete, returning result 49915 1727204308.93888: _execute() done 49915 1727204308.93890: dumping result to json 49915 1727204308.93911: done dumping result, returning 49915 1727204308.93993: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-dcd7-b5af-000000000022] 49915 1727204308.94002: sending task result for task 028d2410-947f-dcd7-b5af-000000000022 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49915 1727204308.94451: no more pending results, returning what we have 49915 1727204308.94454: results queue empty 49915 1727204308.94455: checking for any_errors_fatal 49915 1727204308.94462: done checking for any_errors_fatal 49915 1727204308.94462: checking for max_fail_percentage 49915 1727204308.94464: done checking for max_fail_percentage 49915 1727204308.94465: checking to see if all hosts have failed and the running result is not ok 49915 1727204308.94467: done checking to see if all hosts have failed 49915 1727204308.94467: getting the remaining hosts for this loop 49915 1727204308.94469: done getting the remaining hosts for this loop 49915 1727204308.94473: getting the next task for host managed-node2 49915 1727204308.94487: done getting next task for host managed-node2 49915 1727204308.94496: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 49915 1727204308.94499: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204308.94515: getting variables 49915 1727204308.94517: in VariableManager get_vars() 49915 1727204308.94562: Calling all_inventory to load vars for managed-node2 49915 1727204308.94565: Calling groups_inventory to load vars for managed-node2 49915 1727204308.94567: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204308.94594: Calling all_plugins_play to load vars for managed-node2 49915 1727204308.94603: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204308.94607: Calling groups_plugins_play to load vars for managed-node2 49915 1727204308.95584: done sending task result for task 028d2410-947f-dcd7-b5af-000000000022 49915 1727204308.95588: WORKER PROCESS EXITING 49915 1727204308.96591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204308.98846: done with get_vars() 49915 1727204308.98869: done getting variables 49915 1727204308.98933: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:58:28 -0400 (0:00:01.352) 0:00:15.695 ***** 49915 1727204308.98967: entering _queue_task() for managed-node2/service 49915 1727204308.99577: worker is 1 (out of 1 available) 49915 1727204308.99589: exiting _queue_task() for managed-node2/service 49915 1727204308.99600: done queuing things up, now waiting for results queue to drain 49915 1727204308.99601: waiting for pending results... 
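
The censored "Enable and start NetworkManager" result a few lines above still echoes its module_args (name=NetworkManager, state=started, enabled=true, scope=system, daemon_reload=false), so the task that produced it can be sketched as the minimal YAML below. This is a reconstruction from the logged invocation only, not the role's actual tasks/main.yml; the task name and the ansible.builtin.systemd spelling (the log routes it through ansible.legacy.systemd) are assumptions, the parameters are not.

# Minimal sketch reconstructed from the module_args shown in the log above;
# task name and module spelling are assumed, the parameter values are taken
# verbatim from the logged invocation.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
    scope: system
    daemon_reload: false
  no_log: true   # matches the "censored ... 'no_log: true'" result in the log
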
49915 1727204308.99882: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 49915 1727204309.00020: in run() - task 028d2410-947f-dcd7-b5af-000000000023 49915 1727204309.00038: variable 'ansible_search_path' from source: unknown 49915 1727204309.00044: variable 'ansible_search_path' from source: unknown 49915 1727204309.00085: calling self._execute() 49915 1727204309.00183: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204309.00194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204309.00207: variable 'omit' from source: magic vars 49915 1727204309.00958: variable 'ansible_distribution_major_version' from source: facts 49915 1727204309.00984: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204309.01251: variable 'network_provider' from source: set_fact 49915 1727204309.01263: Evaluated conditional (network_provider == "nm"): True 49915 1727204309.01671: variable '__network_wpa_supplicant_required' from source: role '' defaults 49915 1727204309.01699: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 49915 1727204309.01982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204309.06459: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204309.06680: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204309.06728: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204309.06772: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204309.06815: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204309.06936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204309.06973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204309.07008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204309.07065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204309.07086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204309.07140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204309.07171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 49915 1727204309.07204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204309.07252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204309.07271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204309.07317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204309.07349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204309.07458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204309.07462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204309.07464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204309.07589: variable 'network_connections' from source: task vars 49915 1727204309.07608: variable 'interface' from source: play vars 49915 1727204309.07690: variable 'interface' from source: play vars 49915 1727204309.07707: variable 'vlan_interface' from source: play vars 49915 1727204309.07771: variable 'vlan_interface' from source: play vars 49915 1727204309.07788: variable 'interface' from source: play vars 49915 1727204309.07847: variable 'interface' from source: play vars 49915 1727204309.07931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204309.08106: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204309.08147: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204309.08194: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204309.08233: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204309.08280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204309.08381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204309.08384: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204309.08387: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204309.08433: variable '__network_wireless_connections_defined' from source: role '' defaults 49915 1727204309.08879: variable 'network_connections' from source: task vars 49915 1727204309.08883: variable 'interface' from source: play vars 49915 1727204309.08998: variable 'interface' from source: play vars 49915 1727204309.09191: variable 'vlan_interface' from source: play vars 49915 1727204309.09194: variable 'vlan_interface' from source: play vars 49915 1727204309.09197: variable 'interface' from source: play vars 49915 1727204309.09259: variable 'interface' from source: play vars 49915 1727204309.09385: Evaluated conditional (__network_wpa_supplicant_required): False 49915 1727204309.09415: when evaluation is False, skipping this task 49915 1727204309.09423: _execute() done 49915 1727204309.09430: dumping result to json 49915 1727204309.09436: done dumping result, returning 49915 1727204309.09447: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-dcd7-b5af-000000000023] 49915 1727204309.09455: sending task result for task 028d2410-947f-dcd7-b5af-000000000023 49915 1727204309.09827: done sending task result for task 028d2410-947f-dcd7-b5af-000000000023 49915 1727204309.09830: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 49915 1727204309.09882: no more pending results, returning what we have 49915 1727204309.09886: results queue empty 49915 1727204309.09887: checking for any_errors_fatal 49915 1727204309.09908: done checking for any_errors_fatal 49915 1727204309.09909: checking for max_fail_percentage 49915 1727204309.09914: done checking for max_fail_percentage 49915 1727204309.09915: checking to see if all hosts have failed and the running result is not ok 49915 1727204309.09916: done checking to see if all hosts have failed 49915 1727204309.09917: getting the remaining hosts for this loop 49915 1727204309.09918: done getting the remaining hosts for this loop 49915 1727204309.09924: getting the next task for host managed-node2 49915 1727204309.09931: done getting next task for host managed-node2 49915 1727204309.09935: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 49915 1727204309.09938: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204309.09952: getting variables 49915 1727204309.09954: in VariableManager get_vars() 49915 1727204309.10000: Calling all_inventory to load vars for managed-node2 49915 1727204309.10003: Calling groups_inventory to load vars for managed-node2 49915 1727204309.10006: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204309.10019: Calling all_plugins_play to load vars for managed-node2 49915 1727204309.10022: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204309.10025: Calling groups_plugins_play to load vars for managed-node2 49915 1727204309.12108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204309.14243: done with get_vars() 49915 1727204309.14269: done getting variables 49915 1727204309.14538: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:58:29 -0400 (0:00:00.156) 0:00:15.852 ***** 49915 1727204309.14649: entering _queue_task() for managed-node2/service 49915 1727204309.15468: worker is 1 (out of 1 available) 49915 1727204309.15483: exiting _queue_task() for managed-node2/service 49915 1727204309.15495: done queuing things up, now waiting for results queue to drain 49915 1727204309.15497: waiting for pending results... 49915 1727204309.15988: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 49915 1727204309.16392: in run() - task 028d2410-947f-dcd7-b5af-000000000024 49915 1727204309.16396: variable 'ansible_search_path' from source: unknown 49915 1727204309.16399: variable 'ansible_search_path' from source: unknown 49915 1727204309.16402: calling self._execute() 49915 1727204309.16607: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204309.16624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204309.16717: variable 'omit' from source: magic vars 49915 1727204309.17582: variable 'ansible_distribution_major_version' from source: facts 49915 1727204309.17586: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204309.17719: variable 'network_provider' from source: set_fact 49915 1727204309.17881: Evaluated conditional (network_provider == "initscripts"): False 49915 1727204309.17884: when evaluation is False, skipping this task 49915 1727204309.17886: _execute() done 49915 1727204309.17888: dumping result to json 49915 1727204309.17890: done dumping result, returning 49915 1727204309.17893: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-dcd7-b5af-000000000024] 49915 1727204309.17895: sending task result for task 028d2410-947f-dcd7-b5af-000000000024 49915 1727204309.17964: done sending task result for task 028d2410-947f-dcd7-b5af-000000000024 49915 1727204309.17967: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49915 
1727204309.18018: no more pending results, returning what we have 49915 1727204309.18022: results queue empty 49915 1727204309.18023: checking for any_errors_fatal 49915 1727204309.18035: done checking for any_errors_fatal 49915 1727204309.18036: checking for max_fail_percentage 49915 1727204309.18038: done checking for max_fail_percentage 49915 1727204309.18039: checking to see if all hosts have failed and the running result is not ok 49915 1727204309.18040: done checking to see if all hosts have failed 49915 1727204309.18041: getting the remaining hosts for this loop 49915 1727204309.18043: done getting the remaining hosts for this loop 49915 1727204309.18047: getting the next task for host managed-node2 49915 1727204309.18056: done getting next task for host managed-node2 49915 1727204309.18060: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 49915 1727204309.18063: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204309.18081: getting variables 49915 1727204309.18083: in VariableManager get_vars() 49915 1727204309.18132: Calling all_inventory to load vars for managed-node2 49915 1727204309.18135: Calling groups_inventory to load vars for managed-node2 49915 1727204309.18137: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204309.18149: Calling all_plugins_play to load vars for managed-node2 49915 1727204309.18152: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204309.18155: Calling groups_plugins_play to load vars for managed-node2 49915 1727204309.21539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204309.24738: done with get_vars() 49915 1727204309.24768: done getting variables 49915 1727204309.24834: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:58:29 -0400 (0:00:00.102) 0:00:15.954 ***** 49915 1727204309.24869: entering _queue_task() for managed-node2/copy 49915 1727204309.25626: worker is 1 (out of 1 available) 49915 1727204309.25638: exiting _queue_task() for managed-node2/copy 49915 1727204309.25649: done queuing things up, now waiting for results queue to drain 49915 1727204309.25651: waiting for pending results... 
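
The two skips above ("Enable and start wpa_supplicant" and "Enable network service") follow the same pattern: each task's when: clause is evaluated against facts and role defaults, and the "Evaluated conditional" / "false_condition" lines record exactly which expression short-circuited it. A minimal sketch of that gating is below; only the when: expressions are taken from the log, while the module calls, task bodies, and service names are placeholders and not the role's real implementation.

# Gating sketch only: the when: expressions mirror the "Evaluated conditional"
# lines in the log; the service modules and names here are assumed placeholders.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant   # assumed service name
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm"
    - __network_wpa_supplicant_required

- name: Enable network service
  ansible.builtin.service:
    name: network          # assumed service name; this task is skipped on this host
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "initscripts"
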
49915 1727204309.26394: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 49915 1727204309.26399: in run() - task 028d2410-947f-dcd7-b5af-000000000025 49915 1727204309.26403: variable 'ansible_search_path' from source: unknown 49915 1727204309.26520: variable 'ansible_search_path' from source: unknown 49915 1727204309.26530: calling self._execute() 49915 1727204309.26625: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204309.26851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204309.26854: variable 'omit' from source: magic vars 49915 1727204309.27555: variable 'ansible_distribution_major_version' from source: facts 49915 1727204309.27572: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204309.27773: variable 'network_provider' from source: set_fact 49915 1727204309.27836: Evaluated conditional (network_provider == "initscripts"): False 49915 1727204309.27980: when evaluation is False, skipping this task 49915 1727204309.27983: _execute() done 49915 1727204309.27985: dumping result to json 49915 1727204309.27987: done dumping result, returning 49915 1727204309.27990: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-dcd7-b5af-000000000025] 49915 1727204309.27992: sending task result for task 028d2410-947f-dcd7-b5af-000000000025 49915 1727204309.28119: done sending task result for task 028d2410-947f-dcd7-b5af-000000000025 49915 1727204309.28123: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 49915 1727204309.28199: no more pending results, returning what we have 49915 1727204309.28203: results queue empty 49915 1727204309.28204: checking for any_errors_fatal 49915 1727204309.28211: done checking for any_errors_fatal 49915 1727204309.28214: checking for max_fail_percentage 49915 1727204309.28217: done checking for max_fail_percentage 49915 1727204309.28218: checking to see if all hosts have failed and the running result is not ok 49915 1727204309.28219: done checking to see if all hosts have failed 49915 1727204309.28219: getting the remaining hosts for this loop 49915 1727204309.28221: done getting the remaining hosts for this loop 49915 1727204309.28225: getting the next task for host managed-node2 49915 1727204309.28232: done getting next task for host managed-node2 49915 1727204309.28238: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 49915 1727204309.28241: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204309.28257: getting variables 49915 1727204309.28259: in VariableManager get_vars() 49915 1727204309.28307: Calling all_inventory to load vars for managed-node2 49915 1727204309.28310: Calling groups_inventory to load vars for managed-node2 49915 1727204309.28315: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204309.28327: Calling all_plugins_play to load vars for managed-node2 49915 1727204309.28330: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204309.28333: Calling groups_plugins_play to load vars for managed-node2 49915 1727204309.31445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204309.34579: done with get_vars() 49915 1727204309.34611: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:58:29 -0400 (0:00:00.098) 0:00:16.053 ***** 49915 1727204309.34765: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 49915 1727204309.34767: Creating lock for fedora.linux_system_roles.network_connections 49915 1727204309.35675: worker is 1 (out of 1 available) 49915 1727204309.35689: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 49915 1727204309.35701: done queuing things up, now waiting for results queue to drain 49915 1727204309.35702: waiting for pending results... 49915 1727204309.36394: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 49915 1727204309.36530: in run() - task 028d2410-947f-dcd7-b5af-000000000026 49915 1727204309.36544: variable 'ansible_search_path' from source: unknown 49915 1727204309.36548: variable 'ansible_search_path' from source: unknown 49915 1727204309.36629: calling self._execute() 49915 1727204309.36721: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204309.36841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204309.36852: variable 'omit' from source: magic vars 49915 1727204309.37653: variable 'ansible_distribution_major_version' from source: facts 49915 1727204309.37682: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204309.37688: variable 'omit' from source: magic vars 49915 1727204309.37790: variable 'omit' from source: magic vars 49915 1727204309.38283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204309.43281: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204309.43394: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204309.43434: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204309.43466: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204309.43495: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204309.43579: variable 'network_provider' from source: set_fact 49915 1727204309.43747: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204309.43791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204309.43820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204309.43859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204309.43954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204309.43958: variable 'omit' from source: magic vars 49915 1727204309.44064: variable 'omit' from source: magic vars 49915 1727204309.44162: variable 'network_connections' from source: task vars 49915 1727204309.44174: variable 'interface' from source: play vars 49915 1727204309.44283: variable 'interface' from source: play vars 49915 1727204309.44287: variable 'vlan_interface' from source: play vars 49915 1727204309.44310: variable 'vlan_interface' from source: play vars 49915 1727204309.44343: variable 'interface' from source: play vars 49915 1727204309.44565: variable 'interface' from source: play vars 49915 1727204309.44569: variable 'omit' from source: magic vars 49915 1727204309.44572: variable '__lsr_ansible_managed' from source: task vars 49915 1727204309.44754: variable '__lsr_ansible_managed' from source: task vars 49915 1727204309.45248: Loaded config def from plugin (lookup/template) 49915 1727204309.45252: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 49915 1727204309.45280: File lookup term: get_ansible_managed.j2 49915 1727204309.45284: variable 'ansible_search_path' from source: unknown 49915 1727204309.45288: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 49915 1727204309.45423: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 49915 
1727204309.45439: variable 'ansible_search_path' from source: unknown 49915 1727204309.54885: variable 'ansible_managed' from source: unknown 49915 1727204309.55208: variable 'omit' from source: magic vars 49915 1727204309.55393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204309.55425: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204309.55439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204309.55456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204309.55467: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204309.55528: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204309.55531: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204309.55533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204309.55686: Set connection var ansible_connection to ssh 49915 1727204309.55690: Set connection var ansible_shell_type to sh 49915 1727204309.55701: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204309.55712: Set connection var ansible_shell_executable to /bin/sh 49915 1727204309.55720: Set connection var ansible_timeout to 10 49915 1727204309.55727: Set connection var ansible_pipelining to False 49915 1727204309.55761: variable 'ansible_shell_executable' from source: unknown 49915 1727204309.55764: variable 'ansible_connection' from source: unknown 49915 1727204309.55768: variable 'ansible_module_compression' from source: unknown 49915 1727204309.55770: variable 'ansible_shell_type' from source: unknown 49915 1727204309.55772: variable 'ansible_shell_executable' from source: unknown 49915 1727204309.55775: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204309.55779: variable 'ansible_pipelining' from source: unknown 49915 1727204309.55782: variable 'ansible_timeout' from source: unknown 49915 1727204309.55854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204309.55927: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49915 1727204309.55936: variable 'omit' from source: magic vars 49915 1727204309.55944: starting attempt loop 49915 1727204309.55948: running the handler 49915 1727204309.55962: _low_level_execute_command(): starting 49915 1727204309.55965: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204309.56471: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204309.56480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204309.56505: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 49915 1727204309.56509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 49915 1727204309.56511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204309.56557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204309.56566: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204309.56657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204309.58377: stdout chunk (state=3): >>>/root <<< 49915 1727204309.58467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204309.58503: stderr chunk (state=3): >>><<< 49915 1727204309.58506: stdout chunk (state=3): >>><<< 49915 1727204309.58563: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204309.58566: _low_level_execute_command(): starting 49915 1727204309.58571: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204309.585213-51074-251770900882760 `" && echo ansible-tmp-1727204309.585213-51074-251770900882760="` echo /root/.ansible/tmp/ansible-tmp-1727204309.585213-51074-251770900882760 `" ) && sleep 0' 49915 1727204309.58982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204309.58985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204309.58988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49915 
1727204309.58990: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204309.58992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204309.59048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204309.59072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204309.59149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204309.61071: stdout chunk (state=3): >>>ansible-tmp-1727204309.585213-51074-251770900882760=/root/.ansible/tmp/ansible-tmp-1727204309.585213-51074-251770900882760 <<< 49915 1727204309.61178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204309.61209: stderr chunk (state=3): >>><<< 49915 1727204309.61211: stdout chunk (state=3): >>><<< 49915 1727204309.61254: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204309.585213-51074-251770900882760=/root/.ansible/tmp/ansible-tmp-1727204309.585213-51074-251770900882760 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204309.61266: variable 'ansible_module_compression' from source: unknown 49915 1727204309.61306: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 49915 1727204309.61309: ANSIBALLZ: Acquiring lock 49915 1727204309.61312: ANSIBALLZ: Lock acquired: 140698006072944 49915 1727204309.61314: ANSIBALLZ: Creating module 49915 1727204309.79471: ANSIBALLZ: Writing module into payload 49915 1727204309.79712: ANSIBALLZ: Writing module 49915 1727204309.79763: ANSIBALLZ: Renaming module 49915 1727204309.79767: ANSIBALLZ: Done creating module 49915 1727204309.79770: variable 'ansible_facts' from source: unknown 49915 1727204309.79868: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204309.585213-51074-251770900882760/AnsiballZ_network_connections.py 49915 1727204309.79989: Sending initial data 49915 
1727204309.79993: Sent initial data (167 bytes) 49915 1727204309.80470: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204309.80489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204309.80493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204309.80524: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204309.80527: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 49915 1727204309.80530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204309.80582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204309.80598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204309.80688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204309.82352: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 49915 1727204309.82355: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204309.82421: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49915 1727204309.82500: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpfy3zzufe /root/.ansible/tmp/ansible-tmp-1727204309.585213-51074-251770900882760/AnsiballZ_network_connections.py <<< 49915 1727204309.82503: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204309.585213-51074-251770900882760/AnsiballZ_network_connections.py" <<< 49915 1727204309.82569: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpfy3zzufe" to remote "/root/.ansible/tmp/ansible-tmp-1727204309.585213-51074-251770900882760/AnsiballZ_network_connections.py" <<< 49915 1727204309.82577: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204309.585213-51074-251770900882760/AnsiballZ_network_connections.py" <<< 49915 1727204309.83484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204309.83528: stderr chunk (state=3): >>><<< 49915 1727204309.83532: stdout chunk (state=3): >>><<< 49915 1727204309.83560: done transferring module to remote 49915 1727204309.83569: _low_level_execute_command(): starting 49915 1727204309.83574: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204309.585213-51074-251770900882760/ /root/.ansible/tmp/ansible-tmp-1727204309.585213-51074-251770900882760/AnsiballZ_network_connections.py && sleep 0' 49915 1727204309.84037: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204309.84040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204309.84042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204309.84044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49915 1727204309.84046: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204309.84048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204309.84098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204309.84105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204309.84177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204309.85971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204309.86011: stderr chunk (state=3): >>><<< 49915 1727204309.86017: stdout chunk (state=3): >>><<< 49915 1727204309.86020: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204309.86025: _low_level_execute_command(): starting 49915 1727204309.86030: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204309.585213-51074-251770900882760/AnsiballZ_network_connections.py && sleep 0' 49915 1727204309.86972: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204309.87032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204309.87097: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204309.87136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204309.87149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204309.87206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204309.87293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204310.22937: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, 2409d17b-b636-4a3f-a5ef-a537c98e999e\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 6c870b68-6dd2-4763-9564-574ea4efb444\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, 2409d17b-b636-4a3f-a5ef-a537c98e999e (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 6c870b68-6dd2-4763-9564-574ea4efb444 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": 
"ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 49915 1727204310.26082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 49915 1727204310.26087: stdout chunk (state=3): >>><<< 49915 1727204310.26089: stderr chunk (state=3): >>><<< 49915 1727204310.26092: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, 2409d17b-b636-4a3f-a5ef-a537c98e999e\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 6c870b68-6dd2-4763-9564-574ea4efb444\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, 2409d17b-b636-4a3f-a5ef-a537c98e999e (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 6c870b68-6dd2-4763-9564-574ea4efb444 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 49915 1727204310.26136: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr101', 'type': 'ethernet', 'state': 'up', 'mtu': 1492, 'autoconnect': False, 'ip': {'dhcp4': False, 'auto6': False}}, {'name': 'lsr101.90', 'parent': 'lsr101', 'type': 'vlan', 'vlan_id': 90, 'mtu': 1280, 'state': 'up', 'autoconnect': False, 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204309.585213-51074-251770900882760/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204310.26188: _low_level_execute_command(): starting 49915 1727204310.26199: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204309.585213-51074-251770900882760/ > /dev/null 2>&1 && sleep 0' 49915 1727204310.27592: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204310.27802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204310.27867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204310.27971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204310.29873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204310.29923: stderr chunk (state=3): >>><<< 49915 1727204310.30047: stdout chunk (state=3): >>><<< 49915 1727204310.30065: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204310.30072: handler run complete 49915 1727204310.30121: attempt loop complete, returning result 49915 1727204310.30124: _execute() done 49915 1727204310.30126: dumping result to json 49915 1727204310.30132: done dumping result, returning 49915 1727204310.30142: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-dcd7-b5af-000000000026] 49915 1727204310.30536: sending task result for task 028d2410-947f-dcd7-b5af-000000000026 49915 1727204310.30608: done sending task result for task 028d2410-947f-dcd7-b5af-000000000026 49915 1727204310.30611: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1492, "name": "lsr101", "state": "up", "type": "ethernet" }, { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1280, "name": "lsr101.90", "parent": "lsr101", "state": "up", "type": "vlan", "vlan_id": 90 } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, 2409d17b-b636-4a3f-a5ef-a537c98e999e [006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 6c870b68-6dd2-4763-9564-574ea4efb444 [007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, 2409d17b-b636-4a3f-a5ef-a537c98e999e (not-active) [008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 6c870b68-6dd2-4763-9564-574ea4efb444 (not-active) 49915 1727204310.30767: no more pending results, returning what we have 49915 1727204310.30771: results queue empty 49915 1727204310.30772: checking for any_errors_fatal 49915 1727204310.30783: done checking for any_errors_fatal 49915 1727204310.30784: checking for max_fail_percentage 49915 1727204310.30787: done checking for max_fail_percentage 49915 1727204310.30788: checking to see if all hosts have failed and the running result is not ok 49915 1727204310.30789: done checking to see if all hosts have failed 49915 1727204310.30790: getting the remaining hosts for this loop 49915 1727204310.30792: done getting the remaining 
hosts for this loop 49915 1727204310.30796: getting the next task for host managed-node2 49915 1727204310.30802: done getting next task for host managed-node2 49915 1727204310.30806: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 49915 1727204310.30809: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204310.30822: getting variables 49915 1727204310.30823: in VariableManager get_vars() 49915 1727204310.30861: Calling all_inventory to load vars for managed-node2 49915 1727204310.30864: Calling groups_inventory to load vars for managed-node2 49915 1727204310.30866: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204310.30875: Calling all_plugins_play to load vars for managed-node2 49915 1727204310.31272: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204310.31278: Calling groups_plugins_play to load vars for managed-node2 49915 1727204310.33278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204310.35861: done with get_vars() 49915 1727204310.35891: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:58:30 -0400 (0:00:01.012) 0:00:17.066 ***** 49915 1727204310.35983: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 49915 1727204310.35985: Creating lock for fedora.linux_system_roles.network_state 49915 1727204310.36345: worker is 1 (out of 1 available) 49915 1727204310.36364: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 49915 1727204310.36382: done queuing things up, now waiting for results queue to drain 49915 1727204310.36383: waiting for pending results... 
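The module_args recorded above for the Configure networking connection profiles task are the role's rendering of its network_connections input. Below is a minimal sketch of a play that would drive fedora.linux_system_roles.network to produce that invocation; the connection values are taken directly from the logged result, but the surrounding play structure is an assumption, since the actual test playbook under /tmp/collections-bGV is not reproduced in this log.

    - hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_connections:
              # Parent Ethernet profile; MTU, autoconnect and IP settings
              # match the logged module_args.
              - name: lsr101
                type: ethernet
                state: up
                autoconnect: false
                mtu: 1492
                ip:
                  dhcp4: false
                  auto6: false
              # VLAN 90 stacked on lsr101, with a smaller MTU.
              - name: lsr101.90
                parent: lsr101
                type: vlan
                vlan_id: 90
                state: up
                autoconnect: false
                mtu: 1280
                ip:
                  dhcp4: false
                  auto6: false

With provider nm, as reported in the result, the role turns each list entry into a NetworkManager connection profile, which is why the module's stderr trace shows an add followed by an up for both lsr101 and lsr101.90.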
49915 1727204310.36616: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 49915 1727204310.36870: in run() - task 028d2410-947f-dcd7-b5af-000000000027 49915 1727204310.36873: variable 'ansible_search_path' from source: unknown 49915 1727204310.36878: variable 'ansible_search_path' from source: unknown 49915 1727204310.36881: calling self._execute() 49915 1727204310.36924: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204310.36936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204310.36950: variable 'omit' from source: magic vars 49915 1727204310.37334: variable 'ansible_distribution_major_version' from source: facts 49915 1727204310.37350: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204310.37470: variable 'network_state' from source: role '' defaults 49915 1727204310.37566: Evaluated conditional (network_state != {}): False 49915 1727204310.37569: when evaluation is False, skipping this task 49915 1727204310.37573: _execute() done 49915 1727204310.37576: dumping result to json 49915 1727204310.37579: done dumping result, returning 49915 1727204310.37587: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-dcd7-b5af-000000000027] 49915 1727204310.37593: sending task result for task 028d2410-947f-dcd7-b5af-000000000027 49915 1727204310.37719: done sending task result for task 028d2410-947f-dcd7-b5af-000000000027 49915 1727204310.37723: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49915 1727204310.37790: no more pending results, returning what we have 49915 1727204310.37795: results queue empty 49915 1727204310.37796: checking for any_errors_fatal 49915 1727204310.37810: done checking for any_errors_fatal 49915 1727204310.37811: checking for max_fail_percentage 49915 1727204310.37813: done checking for max_fail_percentage 49915 1727204310.37813: checking to see if all hosts have failed and the running result is not ok 49915 1727204310.37815: done checking to see if all hosts have failed 49915 1727204310.37815: getting the remaining hosts for this loop 49915 1727204310.37817: done getting the remaining hosts for this loop 49915 1727204310.37822: getting the next task for host managed-node2 49915 1727204310.37830: done getting next task for host managed-node2 49915 1727204310.37835: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 49915 1727204310.37839: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204310.37855: getting variables 49915 1727204310.37857: in VariableManager get_vars() 49915 1727204310.37906: Calling all_inventory to load vars for managed-node2 49915 1727204310.37910: Calling groups_inventory to load vars for managed-node2 49915 1727204310.37912: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204310.37924: Calling all_plugins_play to load vars for managed-node2 49915 1727204310.37927: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204310.37930: Calling groups_plugins_play to load vars for managed-node2 49915 1727204310.40692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204310.41705: done with get_vars() 49915 1727204310.41724: done getting variables 49915 1727204310.41770: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:58:30 -0400 (0:00:00.058) 0:00:17.124 ***** 49915 1727204310.41797: entering _queue_task() for managed-node2/debug 49915 1727204310.42059: worker is 1 (out of 1 available) 49915 1727204310.42072: exiting _queue_task() for managed-node2/debug 49915 1727204310.42086: done queuing things up, now waiting for results queue to drain 49915 1727204310.42088: waiting for pending results... 49915 1727204310.42344: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 49915 1727204310.42686: in run() - task 028d2410-947f-dcd7-b5af-000000000028 49915 1727204310.42690: variable 'ansible_search_path' from source: unknown 49915 1727204310.42692: variable 'ansible_search_path' from source: unknown 49915 1727204310.42699: calling self._execute() 49915 1727204310.42702: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204310.42707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204310.42709: variable 'omit' from source: magic vars 49915 1727204310.43064: variable 'ansible_distribution_major_version' from source: facts 49915 1727204310.43081: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204310.43093: variable 'omit' from source: magic vars 49915 1727204310.43192: variable 'omit' from source: magic vars 49915 1727204310.43243: variable 'omit' from source: magic vars 49915 1727204310.43308: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204310.43348: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204310.43388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204310.43415: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204310.43434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204310.43588: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204310.43591: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204310.43594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204310.43604: Set connection var ansible_connection to ssh 49915 1727204310.43615: Set connection var ansible_shell_type to sh 49915 1727204310.43629: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204310.43643: Set connection var ansible_shell_executable to /bin/sh 49915 1727204310.43652: Set connection var ansible_timeout to 10 49915 1727204310.43664: Set connection var ansible_pipelining to False 49915 1727204310.43696: variable 'ansible_shell_executable' from source: unknown 49915 1727204310.43714: variable 'ansible_connection' from source: unknown 49915 1727204310.43724: variable 'ansible_module_compression' from source: unknown 49915 1727204310.43805: variable 'ansible_shell_type' from source: unknown 49915 1727204310.43808: variable 'ansible_shell_executable' from source: unknown 49915 1727204310.43814: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204310.43816: variable 'ansible_pipelining' from source: unknown 49915 1727204310.43818: variable 'ansible_timeout' from source: unknown 49915 1727204310.43916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204310.44389: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204310.44394: variable 'omit' from source: magic vars 49915 1727204310.44396: starting attempt loop 49915 1727204310.44398: running the handler 49915 1727204310.44558: variable '__network_connections_result' from source: set_fact 49915 1727204310.44652: handler run complete 49915 1727204310.44678: attempt loop complete, returning result 49915 1727204310.44686: _execute() done 49915 1727204310.44694: dumping result to json 49915 1727204310.44716: done dumping result, returning 49915 1727204310.44781: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-dcd7-b5af-000000000028] 49915 1727204310.44785: sending task result for task 028d2410-947f-dcd7-b5af-000000000028 49915 1727204310.44865: done sending task result for task 028d2410-947f-dcd7-b5af-000000000028 49915 1727204310.44868: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, 2409d17b-b636-4a3f-a5ef-a537c98e999e", "[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 6c870b68-6dd2-4763-9564-574ea4efb444", "[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, 2409d17b-b636-4a3f-a5ef-a537c98e999e (not-active)", "[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 6c870b68-6dd2-4763-9564-574ea4efb444 (not-active)" ] } 49915 1727204310.44944: no more pending results, returning what we have 49915 1727204310.44947: results queue empty 49915 1727204310.44948: checking for any_errors_fatal 49915 1727204310.44957: done checking for any_errors_fatal 49915 1727204310.44957: checking for max_fail_percentage 49915 
1727204310.44959: done checking for max_fail_percentage 49915 1727204310.44960: checking to see if all hosts have failed and the running result is not ok 49915 1727204310.44961: done checking to see if all hosts have failed 49915 1727204310.44961: getting the remaining hosts for this loop 49915 1727204310.44963: done getting the remaining hosts for this loop 49915 1727204310.44967: getting the next task for host managed-node2 49915 1727204310.44974: done getting next task for host managed-node2 49915 1727204310.44979: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 49915 1727204310.44982: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204310.44993: getting variables 49915 1727204310.44994: in VariableManager get_vars() 49915 1727204310.45033: Calling all_inventory to load vars for managed-node2 49915 1727204310.45036: Calling groups_inventory to load vars for managed-node2 49915 1727204310.45039: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204310.45049: Calling all_plugins_play to load vars for managed-node2 49915 1727204310.45052: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204310.45055: Calling groups_plugins_play to load vars for managed-node2 49915 1727204310.45897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204310.47900: done with get_vars() 49915 1727204310.47929: done getting variables 49915 1727204310.47985: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:58:30 -0400 (0:00:00.062) 0:00:17.186 ***** 49915 1727204310.48035: entering _queue_task() for managed-node2/debug 49915 1727204310.48427: worker is 1 (out of 1 available) 49915 1727204310.48440: exiting _queue_task() for managed-node2/debug 49915 1727204310.48453: done queuing things up, now waiting for results queue to drain 49915 1727204310.48454: waiting for pending results... 
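The task that just completed, Show stderr messages for the network_connections, is an ordinary debug action over the fact registered by the module run; the executor loads the debug action plugin from its cache and prints __network_connections_result.stderr_lines without contacting the remote host. A rough sketch of such a task, assuming the variable name and conditional seen in the log (the real source at roles/network/tasks/main.yml:177 is not included here, and the distribution check may be inherited from an enclosing block rather than set on the task itself):

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines
      when: ansible_distribution_major_version != '6'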
49915 1727204310.48897: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 49915 1727204310.48904: in run() - task 028d2410-947f-dcd7-b5af-000000000029 49915 1727204310.48930: variable 'ansible_search_path' from source: unknown 49915 1727204310.48937: variable 'ansible_search_path' from source: unknown 49915 1727204310.48979: calling self._execute() 49915 1727204310.49085: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204310.49107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204310.49117: variable 'omit' from source: magic vars 49915 1727204310.49418: variable 'ansible_distribution_major_version' from source: facts 49915 1727204310.49432: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204310.49435: variable 'omit' from source: magic vars 49915 1727204310.49473: variable 'omit' from source: magic vars 49915 1727204310.49514: variable 'omit' from source: magic vars 49915 1727204310.49547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204310.49575: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204310.49592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204310.49605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204310.49617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204310.49640: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204310.49644: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204310.49646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204310.49717: Set connection var ansible_connection to ssh 49915 1727204310.49720: Set connection var ansible_shell_type to sh 49915 1727204310.49724: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204310.49731: Set connection var ansible_shell_executable to /bin/sh 49915 1727204310.49736: Set connection var ansible_timeout to 10 49915 1727204310.49742: Set connection var ansible_pipelining to False 49915 1727204310.49763: variable 'ansible_shell_executable' from source: unknown 49915 1727204310.49766: variable 'ansible_connection' from source: unknown 49915 1727204310.49769: variable 'ansible_module_compression' from source: unknown 49915 1727204310.49771: variable 'ansible_shell_type' from source: unknown 49915 1727204310.49774: variable 'ansible_shell_executable' from source: unknown 49915 1727204310.49777: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204310.49780: variable 'ansible_pipelining' from source: unknown 49915 1727204310.49782: variable 'ansible_timeout' from source: unknown 49915 1727204310.49784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204310.49890: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 
1727204310.49901: variable 'omit' from source: magic vars 49915 1727204310.49904: starting attempt loop 49915 1727204310.49908: running the handler 49915 1727204310.49946: variable '__network_connections_result' from source: set_fact 49915 1727204310.50030: variable '__network_connections_result' from source: set_fact 49915 1727204310.50285: handler run complete 49915 1727204310.50288: attempt loop complete, returning result 49915 1727204310.50291: _execute() done 49915 1727204310.50292: dumping result to json 49915 1727204310.50295: done dumping result, returning 49915 1727204310.50297: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-dcd7-b5af-000000000029] 49915 1727204310.50299: sending task result for task 028d2410-947f-dcd7-b5af-000000000029 ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1492, "name": "lsr101", "state": "up", "type": "ethernet" }, { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1280, "name": "lsr101.90", "parent": "lsr101", "state": "up", "type": "vlan", "vlan_id": 90 } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, 2409d17b-b636-4a3f-a5ef-a537c98e999e\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 6c870b68-6dd2-4763-9564-574ea4efb444\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, 2409d17b-b636-4a3f-a5ef-a537c98e999e (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 6c870b68-6dd2-4763-9564-574ea4efb444 (not-active)\n", "stderr_lines": [ "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, 2409d17b-b636-4a3f-a5ef-a537c98e999e", "[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 6c870b68-6dd2-4763-9564-574ea4efb444", "[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, 2409d17b-b636-4a3f-a5ef-a537c98e999e (not-active)", "[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 6c870b68-6dd2-4763-9564-574ea4efb444 (not-active)" ] } } 49915 1727204310.50477: no more pending results, returning what we have 49915 1727204310.50481: results queue empty 49915 1727204310.50482: checking for any_errors_fatal 49915 1727204310.50487: done checking for any_errors_fatal 49915 1727204310.50488: checking for max_fail_percentage 49915 1727204310.50490: done checking for max_fail_percentage 49915 1727204310.50491: checking to see if all hosts have failed and the running result is not ok 49915 1727204310.50492: done checking to see if all hosts have failed 49915 1727204310.50493: getting the remaining hosts for this loop 49915 1727204310.50495: done getting the remaining hosts for this loop 49915 1727204310.50499: getting the next task for host managed-node2 49915 1727204310.50506: done getting next task for host managed-node2 49915 1727204310.50510: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 49915 1727204310.50516: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204310.50534: getting variables 49915 1727204310.50536: in VariableManager get_vars() 49915 1727204310.50694: Calling all_inventory to load vars for managed-node2 49915 1727204310.50698: Calling groups_inventory to load vars for managed-node2 49915 1727204310.50700: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204310.50707: done sending task result for task 028d2410-947f-dcd7-b5af-000000000029 49915 1727204310.50709: WORKER PROCESS EXITING 49915 1727204310.50721: Calling all_plugins_play to load vars for managed-node2 49915 1727204310.50724: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204310.50726: Calling groups_plugins_play to load vars for managed-node2 49915 1727204310.52568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204310.54352: done with get_vars() 49915 1727204310.54446: done getting variables 49915 1727204310.54524: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:58:30 -0400 (0:00:00.065) 0:00:17.251 ***** 49915 1727204310.54564: entering _queue_task() for managed-node2/debug 49915 1727204310.55224: worker is 1 (out of 1 available) 49915 1727204310.55237: exiting _queue_task() for managed-node2/debug 49915 1727204310.55250: done queuing things up, now waiting for results queue to drain 49915 1727204310.55252: waiting for pending results... 
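The task now being queued, Show debug messages for the network_state, sits behind the same guard that already skipped Configure networking state: network_state comes from the role defaults as an empty mapping and the play never overrides it, so the executor evaluates network_state != {} to False and emits a skipping: result instead of running the handler. A hypothetical guard illustrating the pattern (the real task body at roles/network/tasks/main.yml:186 is not visible in this log, so the debug payload below is a placeholder):

    - name: Show debug messages for the network_state
      ansible.builtin.debug:
        msg: "network_state would be shown here"   # placeholder; the variable the role actually prints is not visible in this log
      when: network_state != {}   # role default is {}, so this evaluates to False and the task is skipped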
49915 1727204310.55662: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 49915 1727204310.55887: in run() - task 028d2410-947f-dcd7-b5af-00000000002a 49915 1727204310.55902: variable 'ansible_search_path' from source: unknown 49915 1727204310.55907: variable 'ansible_search_path' from source: unknown 49915 1727204310.55977: calling self._execute() 49915 1727204310.56070: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204310.56080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204310.56086: variable 'omit' from source: magic vars 49915 1727204310.56450: variable 'ansible_distribution_major_version' from source: facts 49915 1727204310.56464: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204310.56579: variable 'network_state' from source: role '' defaults 49915 1727204310.56589: Evaluated conditional (network_state != {}): False 49915 1727204310.56592: when evaluation is False, skipping this task 49915 1727204310.56595: _execute() done 49915 1727204310.56598: dumping result to json 49915 1727204310.56600: done dumping result, returning 49915 1727204310.56633: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-dcd7-b5af-00000000002a] 49915 1727204310.56636: sending task result for task 028d2410-947f-dcd7-b5af-00000000002a 49915 1727204310.56703: done sending task result for task 028d2410-947f-dcd7-b5af-00000000002a 49915 1727204310.56707: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 49915 1727204310.56782: no more pending results, returning what we have 49915 1727204310.56786: results queue empty 49915 1727204310.56788: checking for any_errors_fatal 49915 1727204310.56797: done checking for any_errors_fatal 49915 1727204310.56798: checking for max_fail_percentage 49915 1727204310.56800: done checking for max_fail_percentage 49915 1727204310.56801: checking to see if all hosts have failed and the running result is not ok 49915 1727204310.56802: done checking to see if all hosts have failed 49915 1727204310.56803: getting the remaining hosts for this loop 49915 1727204310.56805: done getting the remaining hosts for this loop 49915 1727204310.56809: getting the next task for host managed-node2 49915 1727204310.56849: done getting next task for host managed-node2 49915 1727204310.56854: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 49915 1727204310.56857: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204310.56869: getting variables 49915 1727204310.56871: in VariableManager get_vars() 49915 1727204310.56906: Calling all_inventory to load vars for managed-node2 49915 1727204310.56909: Calling groups_inventory to load vars for managed-node2 49915 1727204310.56911: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204310.56922: Calling all_plugins_play to load vars for managed-node2 49915 1727204310.56963: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204310.56967: Calling groups_plugins_play to load vars for managed-node2 49915 1727204310.58743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204310.59831: done with get_vars() 49915 1727204310.59850: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:58:30 -0400 (0:00:00.053) 0:00:17.305 ***** 49915 1727204310.59929: entering _queue_task() for managed-node2/ping 49915 1727204310.59930: Creating lock for ping 49915 1727204310.60194: worker is 1 (out of 1 available) 49915 1727204310.60207: exiting _queue_task() for managed-node2/ping 49915 1727204310.60222: done queuing things up, now waiting for results queue to drain 49915 1727204310.60223: waiting for pending results... 49915 1727204310.60405: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 49915 1727204310.60489: in run() - task 028d2410-947f-dcd7-b5af-00000000002b 49915 1727204310.60500: variable 'ansible_search_path' from source: unknown 49915 1727204310.60504: variable 'ansible_search_path' from source: unknown 49915 1727204310.60532: calling self._execute() 49915 1727204310.60604: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204310.60608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204310.60617: variable 'omit' from source: magic vars 49915 1727204310.60904: variable 'ansible_distribution_major_version' from source: facts 49915 1727204310.60916: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204310.60919: variable 'omit' from source: magic vars 49915 1727204310.60960: variable 'omit' from source: magic vars 49915 1727204310.61085: variable 'omit' from source: magic vars 49915 1727204310.61089: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204310.61092: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204310.61115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204310.61137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204310.61153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204310.61198: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204310.61206: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204310.61208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204310.61304: Set connection var ansible_connection to ssh 49915 
1727204310.61312: Set connection var ansible_shell_type to sh 49915 1727204310.61315: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204310.61406: Set connection var ansible_shell_executable to /bin/sh 49915 1727204310.61410: Set connection var ansible_timeout to 10 49915 1727204310.61412: Set connection var ansible_pipelining to False 49915 1727204310.61414: variable 'ansible_shell_executable' from source: unknown 49915 1727204310.61418: variable 'ansible_connection' from source: unknown 49915 1727204310.61420: variable 'ansible_module_compression' from source: unknown 49915 1727204310.61422: variable 'ansible_shell_type' from source: unknown 49915 1727204310.61424: variable 'ansible_shell_executable' from source: unknown 49915 1727204310.61426: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204310.61428: variable 'ansible_pipelining' from source: unknown 49915 1727204310.61430: variable 'ansible_timeout' from source: unknown 49915 1727204310.61432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204310.61662: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49915 1727204310.61666: variable 'omit' from source: magic vars 49915 1727204310.61669: starting attempt loop 49915 1727204310.61671: running the handler 49915 1727204310.61674: _low_level_execute_command(): starting 49915 1727204310.61679: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204310.62256: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204310.62324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204310.62329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204310.62352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204310.62361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204310.62367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204310.62467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204310.64179: stdout chunk (state=3): >>>/root <<< 49915 1727204310.64264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204310.64293: stderr chunk (state=3): >>><<< 49915 1727204310.64297: stdout chunk (state=3): >>><<< 49915 1727204310.64318: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204310.64333: _low_level_execute_command(): starting 49915 1727204310.64336: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204310.6431952-51148-104665646829041 `" && echo ansible-tmp-1727204310.6431952-51148-104665646829041="` echo /root/.ansible/tmp/ansible-tmp-1727204310.6431952-51148-104665646829041 `" ) && sleep 0' 49915 1727204310.64734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204310.64773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204310.64779: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204310.64782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204310.64792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204310.64794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204310.64825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204310.64832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204310.64915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204310.67083: stdout chunk (state=3): >>>ansible-tmp-1727204310.6431952-51148-104665646829041=/root/.ansible/tmp/ansible-tmp-1727204310.6431952-51148-104665646829041 <<< 49915 1727204310.67130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 
1727204310.67134: stderr chunk (state=3): >>><<< 49915 1727204310.67147: stdout chunk (state=3): >>><<< 49915 1727204310.67322: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204310.6431952-51148-104665646829041=/root/.ansible/tmp/ansible-tmp-1727204310.6431952-51148-104665646829041 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204310.67327: variable 'ansible_module_compression' from source: unknown 49915 1727204310.67329: ANSIBALLZ: Using lock for ping 49915 1727204310.67331: ANSIBALLZ: Acquiring lock 49915 1727204310.67333: ANSIBALLZ: Lock acquired: 140698006277824 49915 1727204310.67335: ANSIBALLZ: Creating module 49915 1727204310.89093: ANSIBALLZ: Writing module into payload 49915 1727204310.89154: ANSIBALLZ: Writing module 49915 1727204310.89173: ANSIBALLZ: Renaming module 49915 1727204310.89181: ANSIBALLZ: Done creating module 49915 1727204310.89199: variable 'ansible_facts' from source: unknown 49915 1727204310.89274: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204310.6431952-51148-104665646829041/AnsiballZ_ping.py 49915 1727204310.89405: Sending initial data 49915 1727204310.89421: Sent initial data (153 bytes) 49915 1727204310.89852: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204310.89869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204310.89882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204310.89928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK 
<<< 49915 1727204310.89943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204310.90035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204310.91710: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204310.91772: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49915 1727204310.91853: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpkg73qgi4 /root/.ansible/tmp/ansible-tmp-1727204310.6431952-51148-104665646829041/AnsiballZ_ping.py <<< 49915 1727204310.91857: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204310.6431952-51148-104665646829041/AnsiballZ_ping.py" <<< 49915 1727204310.91914: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpkg73qgi4" to remote "/root/.ansible/tmp/ansible-tmp-1727204310.6431952-51148-104665646829041/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204310.6431952-51148-104665646829041/AnsiballZ_ping.py" <<< 49915 1727204310.92657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204310.92688: stderr chunk (state=3): >>><<< 49915 1727204310.92700: stdout chunk (state=3): >>><<< 49915 1727204310.92736: done transferring module to remote 49915 1727204310.92754: _low_level_execute_command(): starting 49915 1727204310.92757: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204310.6431952-51148-104665646829041/ /root/.ansible/tmp/ansible-tmp-1727204310.6431952-51148-104665646829041/AnsiballZ_ping.py && sleep 0' 49915 1727204310.93494: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204310.93555: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204310.93571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204310.93778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204310.93864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204310.95698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204310.95707: stdout chunk (state=3): >>><<< 49915 1727204310.95716: stderr chunk (state=3): >>><<< 49915 1727204310.95732: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204310.95739: _low_level_execute_command(): starting 49915 1727204310.95749: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204310.6431952-51148-104665646829041/AnsiballZ_ping.py && sleep 0' 49915 1727204310.96224: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204310.96229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 49915 1727204310.96232: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 49915 1727204310.96234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204310.96280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204310.96283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204310.96288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 49915 1727204310.96363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204311.11437: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 49915 1727204311.12703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 49915 1727204311.12734: stderr chunk (state=3): >>><<< 49915 1727204311.12737: stdout chunk (state=3): >>><<< 49915 1727204311.12757: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
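The JSON object in the stdout chunk above is the ping module's reply. For reference, a minimal stand-alone playbook exercising the same module is sketched below; the actual 'Re-test connectivity' task ships inside the fedora.linux_system_roles.network role and is not reproduced in this log, so the YAML is a hypothetical equivalent. The data argument mirrors the module_args recorded a few lines further down ("data": "pong", which is also the module default).

    # Hypothetical equivalent of the 'Re-test connectivity' task traced above.
    - hosts: managed-node2
      gather_facts: false
      tasks:
        - name: Re-test connectivity
          ansible.builtin.ping:
            data: pong   # matches the module_args in the log; pong is the default

A successful run reports ok with changed: false and ping: pong, exactly as the 'ok: [managed-node2]' result that follows shows.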
49915 1727204311.12778: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204310.6431952-51148-104665646829041/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204311.12787: _low_level_execute_command(): starting 49915 1727204311.12791: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204310.6431952-51148-104665646829041/ > /dev/null 2>&1 && sleep 0' 49915 1727204311.13244: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204311.13248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204311.13285: stderr chunk (state=3): >>>debug2: match not found <<< 49915 1727204311.13288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49915 1727204311.13290: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204311.13292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 49915 1727204311.13294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204311.13352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204311.13356: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204311.13358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204311.13436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204311.15301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204311.15328: stderr chunk (state=3): >>><<< 49915 1727204311.15332: stdout chunk (state=3): >>><<< 49915 1727204311.15351: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204311.15354: handler run complete 49915 1727204311.15371: attempt loop complete, returning result 49915 1727204311.15373: _execute() done 49915 1727204311.15377: dumping result to json 49915 1727204311.15380: done dumping result, returning 49915 1727204311.15389: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-dcd7-b5af-00000000002b] 49915 1727204311.15393: sending task result for task 028d2410-947f-dcd7-b5af-00000000002b 49915 1727204311.15485: done sending task result for task 028d2410-947f-dcd7-b5af-00000000002b 49915 1727204311.15488: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 49915 1727204311.15546: no more pending results, returning what we have 49915 1727204311.15549: results queue empty 49915 1727204311.15550: checking for any_errors_fatal 49915 1727204311.15557: done checking for any_errors_fatal 49915 1727204311.15558: checking for max_fail_percentage 49915 1727204311.15559: done checking for max_fail_percentage 49915 1727204311.15560: checking to see if all hosts have failed and the running result is not ok 49915 1727204311.15561: done checking to see if all hosts have failed 49915 1727204311.15562: getting the remaining hosts for this loop 49915 1727204311.15564: done getting the remaining hosts for this loop 49915 1727204311.15567: getting the next task for host managed-node2 49915 1727204311.15578: done getting next task for host managed-node2 49915 1727204311.15581: ^ task is: TASK: meta (role_complete) 49915 1727204311.15584: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204311.15596: getting variables 49915 1727204311.15599: in VariableManager get_vars() 49915 1727204311.15641: Calling all_inventory to load vars for managed-node2 49915 1727204311.15644: Calling groups_inventory to load vars for managed-node2 49915 1727204311.15646: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204311.15655: Calling all_plugins_play to load vars for managed-node2 49915 1727204311.15658: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204311.15660: Calling groups_plugins_play to load vars for managed-node2 49915 1727204311.17247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204311.18118: done with get_vars() 49915 1727204311.18140: done getting variables 49915 1727204311.18204: done queuing things up, now waiting for results queue to drain 49915 1727204311.18207: results queue empty 49915 1727204311.18207: checking for any_errors_fatal 49915 1727204311.18211: done checking for any_errors_fatal 49915 1727204311.18211: checking for max_fail_percentage 49915 1727204311.18212: done checking for max_fail_percentage 49915 1727204311.18213: checking to see if all hosts have failed and the running result is not ok 49915 1727204311.18214: done checking to see if all hosts have failed 49915 1727204311.18214: getting the remaining hosts for this loop 49915 1727204311.18215: done getting the remaining hosts for this loop 49915 1727204311.18217: getting the next task for host managed-node2 49915 1727204311.18220: done getting next task for host managed-node2 49915 1727204311.18222: ^ task is: TASK: Include the task 'assert_device_present.yml' 49915 1727204311.18223: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204311.18225: getting variables 49915 1727204311.18226: in VariableManager get_vars() 49915 1727204311.18238: Calling all_inventory to load vars for managed-node2 49915 1727204311.18239: Calling groups_inventory to load vars for managed-node2 49915 1727204311.18240: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204311.18244: Calling all_plugins_play to load vars for managed-node2 49915 1727204311.18246: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204311.18247: Calling groups_plugins_play to load vars for managed-node2 49915 1727204311.19058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204311.20511: done with get_vars() 49915 1727204311.20539: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:46 Tuesday 24 September 2024 14:58:31 -0400 (0:00:00.606) 0:00:17.912 ***** 49915 1727204311.20618: entering _queue_task() for managed-node2/include_tasks 49915 1727204311.20980: worker is 1 (out of 1 available) 49915 1727204311.20993: exiting _queue_task() for managed-node2/include_tasks 49915 1727204311.21006: done queuing things up, now waiting for results queue to drain 49915 1727204311.21007: waiting for pending results... 
49915 1727204311.21396: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_present.yml' 49915 1727204311.21401: in run() - task 028d2410-947f-dcd7-b5af-00000000005b 49915 1727204311.21415: variable 'ansible_search_path' from source: unknown 49915 1727204311.21458: calling self._execute() 49915 1727204311.21559: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204311.21571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204311.21587: variable 'omit' from source: magic vars 49915 1727204311.21975: variable 'ansible_distribution_major_version' from source: facts 49915 1727204311.21994: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204311.22005: _execute() done 49915 1727204311.22013: dumping result to json 49915 1727204311.22021: done dumping result, returning 49915 1727204311.22034: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_present.yml' [028d2410-947f-dcd7-b5af-00000000005b] 49915 1727204311.22043: sending task result for task 028d2410-947f-dcd7-b5af-00000000005b 49915 1727204311.22295: done sending task result for task 028d2410-947f-dcd7-b5af-00000000005b 49915 1727204311.22298: WORKER PROCESS EXITING 49915 1727204311.22327: no more pending results, returning what we have 49915 1727204311.22333: in VariableManager get_vars() 49915 1727204311.22385: Calling all_inventory to load vars for managed-node2 49915 1727204311.22388: Calling groups_inventory to load vars for managed-node2 49915 1727204311.22390: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204311.22405: Calling all_plugins_play to load vars for managed-node2 49915 1727204311.22409: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204311.22411: Calling groups_plugins_play to load vars for managed-node2 49915 1727204311.23919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204311.25411: done with get_vars() 49915 1727204311.25436: variable 'ansible_search_path' from source: unknown 49915 1727204311.25454: we have included files to process 49915 1727204311.25455: generating all_blocks data 49915 1727204311.25457: done generating all_blocks data 49915 1727204311.25463: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 49915 1727204311.25464: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 49915 1727204311.25467: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 49915 1727204311.25578: in VariableManager get_vars() 49915 1727204311.25604: done with get_vars() 49915 1727204311.25704: done processing included file 49915 1727204311.25707: iterating over new_blocks loaded from include file 49915 1727204311.25708: in VariableManager get_vars() 49915 1727204311.25725: done with get_vars() 49915 1727204311.25727: filtering new block on tags 49915 1727204311.25745: done filtering new block on tags 49915 1727204311.25747: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2 49915 1727204311.25753: extending task lists for 
all hosts with included blocks 49915 1727204311.28064: done extending task lists 49915 1727204311.28065: done processing included files 49915 1727204311.28066: results queue empty 49915 1727204311.28067: checking for any_errors_fatal 49915 1727204311.28068: done checking for any_errors_fatal 49915 1727204311.28069: checking for max_fail_percentage 49915 1727204311.28070: done checking for max_fail_percentage 49915 1727204311.28071: checking to see if all hosts have failed and the running result is not ok 49915 1727204311.28072: done checking to see if all hosts have failed 49915 1727204311.28073: getting the remaining hosts for this loop 49915 1727204311.28074: done getting the remaining hosts for this loop 49915 1727204311.28078: getting the next task for host managed-node2 49915 1727204311.28082: done getting next task for host managed-node2 49915 1727204311.28084: ^ task is: TASK: Include the task 'get_interface_stat.yml' 49915 1727204311.28086: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204311.28089: getting variables 49915 1727204311.28090: in VariableManager get_vars() 49915 1727204311.28105: Calling all_inventory to load vars for managed-node2 49915 1727204311.28107: Calling groups_inventory to load vars for managed-node2 49915 1727204311.28109: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204311.28115: Calling all_plugins_play to load vars for managed-node2 49915 1727204311.28118: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204311.28121: Calling groups_plugins_play to load vars for managed-node2 49915 1727204311.29317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204311.30791: done with get_vars() 49915 1727204311.30824: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:58:31 -0400 (0:00:00.102) 0:00:18.015 ***** 49915 1727204311.30913: entering _queue_task() for managed-node2/include_tasks 49915 1727204311.31506: worker is 1 (out of 1 available) 49915 1727204311.31516: exiting _queue_task() for managed-node2/include_tasks 49915 1727204311.31527: done queuing things up, now waiting for results queue to drain 49915 1727204311.31527: waiting for pending results... 
49915 1727204311.31656: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 49915 1727204311.31703: in run() - task 028d2410-947f-dcd7-b5af-000000000578 49915 1727204311.31722: variable 'ansible_search_path' from source: unknown 49915 1727204311.31730: variable 'ansible_search_path' from source: unknown 49915 1727204311.31777: calling self._execute() 49915 1727204311.31869: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204311.31883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204311.31969: variable 'omit' from source: magic vars 49915 1727204311.32288: variable 'ansible_distribution_major_version' from source: facts 49915 1727204311.32314: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204311.32326: _execute() done 49915 1727204311.32334: dumping result to json 49915 1727204311.32342: done dumping result, returning 49915 1727204311.32354: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-dcd7-b5af-000000000578] 49915 1727204311.32366: sending task result for task 028d2410-947f-dcd7-b5af-000000000578 49915 1727204311.32540: no more pending results, returning what we have 49915 1727204311.32546: in VariableManager get_vars() 49915 1727204311.32598: Calling all_inventory to load vars for managed-node2 49915 1727204311.32601: Calling groups_inventory to load vars for managed-node2 49915 1727204311.32604: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204311.32618: Calling all_plugins_play to load vars for managed-node2 49915 1727204311.32622: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204311.32625: Calling groups_plugins_play to load vars for managed-node2 49915 1727204311.33309: done sending task result for task 028d2410-947f-dcd7-b5af-000000000578 49915 1727204311.33314: WORKER PROCESS EXITING 49915 1727204311.34349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204311.35845: done with get_vars() 49915 1727204311.35870: variable 'ansible_search_path' from source: unknown 49915 1727204311.35872: variable 'ansible_search_path' from source: unknown 49915 1727204311.35912: we have included files to process 49915 1727204311.35914: generating all_blocks data 49915 1727204311.35916: done generating all_blocks data 49915 1727204311.35918: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 49915 1727204311.35919: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 49915 1727204311.35921: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 49915 1727204311.36124: done processing included file 49915 1727204311.36127: iterating over new_blocks loaded from include file 49915 1727204311.36128: in VariableManager get_vars() 49915 1727204311.36151: done with get_vars() 49915 1727204311.36153: filtering new block on tags 49915 1727204311.36170: done filtering new block on tags 49915 1727204311.36173: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 49915 
1727204311.36181: extending task lists for all hosts with included blocks 49915 1727204311.36284: done extending task lists 49915 1727204311.36285: done processing included files 49915 1727204311.36286: results queue empty 49915 1727204311.36287: checking for any_errors_fatal 49915 1727204311.36291: done checking for any_errors_fatal 49915 1727204311.36292: checking for max_fail_percentage 49915 1727204311.36293: done checking for max_fail_percentage 49915 1727204311.36294: checking to see if all hosts have failed and the running result is not ok 49915 1727204311.36295: done checking to see if all hosts have failed 49915 1727204311.36295: getting the remaining hosts for this loop 49915 1727204311.36297: done getting the remaining hosts for this loop 49915 1727204311.36299: getting the next task for host managed-node2 49915 1727204311.36304: done getting next task for host managed-node2 49915 1727204311.36307: ^ task is: TASK: Get stat for interface {{ interface }} 49915 1727204311.36310: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204311.36312: getting variables 49915 1727204311.36313: in VariableManager get_vars() 49915 1727204311.36328: Calling all_inventory to load vars for managed-node2 49915 1727204311.36330: Calling groups_inventory to load vars for managed-node2 49915 1727204311.36333: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204311.36339: Calling all_plugins_play to load vars for managed-node2 49915 1727204311.36341: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204311.36344: Calling groups_plugins_play to load vars for managed-node2 49915 1727204311.42924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204311.46753: done with get_vars() 49915 1727204311.46785: done getting variables 49915 1727204311.47134: variable 'interface' from source: include params 49915 1727204311.47138: variable 'vlan_interface' from source: play vars 49915 1727204311.47203: variable 'vlan_interface' from source: play vars TASK [Get stat for interface lsr101.90] **************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:58:31 -0400 (0:00:00.163) 0:00:18.178 ***** 49915 1727204311.47230: entering _queue_task() for managed-node2/stat 49915 1727204311.47984: worker is 1 (out of 1 available) 49915 1727204311.47996: exiting _queue_task() for managed-node2/stat 49915 1727204311.48009: done queuing things up, now waiting for results queue to drain 49915 1727204311.48011: waiting for pending results... 
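The next trace runs the stat task that get_interface_stat.yml queues for the VLAN device lsr101.90. The task file itself is not reproduced in this log; the sketch below is a hypothetical reconstruction, inferred from the stat module_args that appear further down (path /sys/class/net/lsr101.90 with attribute, checksum and MIME detection disabled) and from the interface_stat variable the later assertion consumes.

    # Hypothetical reconstruction of a get_interface_stat.yml-style task,
    # inferred from the module_args and variable names visible in this log.
    - name: Get stat for interface {{ interface }}
      ansible.builtin.stat:
        path: /sys/class/net/{{ interface }}
        get_attributes: false
        get_checksum: false
        get_mime: false
        follow: false
      register: interface_stat   # variable name taken from the later conditional; the exact registration mechanism is assumed

For a VLAN device the result reports a symlink under /sys/class/net pointing into /sys/devices/virtual/net, which is what the stat output below shows for lsr101.90.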
49915 1727204311.48552: running TaskExecutor() for managed-node2/TASK: Get stat for interface lsr101.90 49915 1727204311.48906: in run() - task 028d2410-947f-dcd7-b5af-00000000069c 49915 1727204311.48910: variable 'ansible_search_path' from source: unknown 49915 1727204311.48913: variable 'ansible_search_path' from source: unknown 49915 1727204311.48941: calling self._execute() 49915 1727204311.49081: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204311.49247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204311.49257: variable 'omit' from source: magic vars 49915 1727204311.49955: variable 'ansible_distribution_major_version' from source: facts 49915 1727204311.50057: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204311.50060: variable 'omit' from source: magic vars 49915 1727204311.50063: variable 'omit' from source: magic vars 49915 1727204311.50151: variable 'interface' from source: include params 49915 1727204311.50154: variable 'vlan_interface' from source: play vars 49915 1727204311.50229: variable 'vlan_interface' from source: play vars 49915 1727204311.50250: variable 'omit' from source: magic vars 49915 1727204311.50294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204311.50339: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204311.50365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204311.50382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204311.50391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204311.50492: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204311.50496: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204311.50499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204311.50600: Set connection var ansible_connection to ssh 49915 1727204311.50604: Set connection var ansible_shell_type to sh 49915 1727204311.50609: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204311.50614: Set connection var ansible_shell_executable to /bin/sh 49915 1727204311.50617: Set connection var ansible_timeout to 10 49915 1727204311.50619: Set connection var ansible_pipelining to False 49915 1727204311.50622: variable 'ansible_shell_executable' from source: unknown 49915 1727204311.50625: variable 'ansible_connection' from source: unknown 49915 1727204311.50628: variable 'ansible_module_compression' from source: unknown 49915 1727204311.50630: variable 'ansible_shell_type' from source: unknown 49915 1727204311.50632: variable 'ansible_shell_executable' from source: unknown 49915 1727204311.50634: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204311.50636: variable 'ansible_pipelining' from source: unknown 49915 1727204311.50638: variable 'ansible_timeout' from source: unknown 49915 1727204311.50641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204311.51010: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49915 1727204311.51015: variable 'omit' from source: magic vars 49915 1727204311.51018: starting attempt loop 49915 1727204311.51020: running the handler 49915 1727204311.51022: _low_level_execute_command(): starting 49915 1727204311.51024: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204311.52085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204311.52090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204311.52094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204311.52097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204311.52100: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204311.52101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204311.52104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204311.53806: stdout chunk (state=3): >>>/root <<< 49915 1727204311.54147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204311.54152: stdout chunk (state=3): >>><<< 49915 1727204311.54160: stderr chunk (state=3): >>><<< 49915 1727204311.54192: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204311.54294: _low_level_execute_command(): starting 
49915 1727204311.54298: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204311.5419083-51185-12110514401545 `" && echo ansible-tmp-1727204311.5419083-51185-12110514401545="` echo /root/.ansible/tmp/ansible-tmp-1727204311.5419083-51185-12110514401545 `" ) && sleep 0' 49915 1727204311.55658: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204311.55797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 49915 1727204311.55810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204311.56094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204311.56115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204311.56218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204311.58249: stdout chunk (state=3): >>>ansible-tmp-1727204311.5419083-51185-12110514401545=/root/.ansible/tmp/ansible-tmp-1727204311.5419083-51185-12110514401545 <<< 49915 1727204311.58413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204311.58463: stderr chunk (state=3): >>><<< 49915 1727204311.58467: stdout chunk (state=3): >>><<< 49915 1727204311.58493: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204311.5419083-51185-12110514401545=/root/.ansible/tmp/ansible-tmp-1727204311.5419083-51185-12110514401545 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 49915 1727204311.58545: variable 'ansible_module_compression' from source: unknown 49915 1727204311.58605: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 49915 1727204311.58646: variable 'ansible_facts' from source: unknown 49915 1727204311.58949: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204311.5419083-51185-12110514401545/AnsiballZ_stat.py 49915 1727204311.59229: Sending initial data 49915 1727204311.59233: Sent initial data (152 bytes) 49915 1727204311.60698: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204311.60812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204311.60985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204311.61059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204311.62957: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204311.62963: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49915 1727204311.63040: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp5j_y5m3s /root/.ansible/tmp/ansible-tmp-1727204311.5419083-51185-12110514401545/AnsiballZ_stat.py <<< 49915 1727204311.63045: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204311.5419083-51185-12110514401545/AnsiballZ_stat.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp5j_y5m3s" to remote "/root/.ansible/tmp/ansible-tmp-1727204311.5419083-51185-12110514401545/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204311.5419083-51185-12110514401545/AnsiballZ_stat.py" <<< 49915 1727204311.64850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204311.64856: stderr chunk (state=3): >>><<< 49915 1727204311.64892: stdout chunk (state=3): >>><<< 49915 1727204311.64899: done transferring module to remote 49915 1727204311.64952: _low_level_execute_command(): starting 49915 1727204311.64988: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204311.5419083-51185-12110514401545/ /root/.ansible/tmp/ansible-tmp-1727204311.5419083-51185-12110514401545/AnsiballZ_stat.py && sleep 0' 49915 1727204311.66583: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204311.66678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204311.66684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204311.66800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204311.67137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204311.67182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204311.69105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204311.69117: stderr chunk (state=3): >>><<< 49915 1727204311.69120: stdout chunk (state=3): >>><<< 49915 1727204311.69298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204311.69302: _low_level_execute_command(): starting 49915 1727204311.69305: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204311.5419083-51185-12110514401545/AnsiballZ_stat.py && sleep 0' 49915 1727204311.70606: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204311.70708: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204311.70834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204311.86019: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101.90", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30583, "dev": 23, "nlink": 1, "atime": 1727204310.1922226, "mtime": 1727204310.1922226, "ctime": 1727204310.1922226, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": "../../devices/virtual/net/lsr101.90", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} <<< 49915 1727204311.87497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 49915 1727204311.87546: stdout chunk (state=3): >>><<< 49915 1727204311.87550: stderr chunk (state=3): >>><<< 49915 1727204311.87552: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101.90", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30583, "dev": 23, "nlink": 1, "atime": 1727204310.1922226, "mtime": 1727204310.1922226, "ctime": 1727204310.1922226, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": "../../devices/virtual/net/lsr101.90", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
49915 1727204311.87683: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr101.90', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204311.5419083-51185-12110514401545/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204311.87687: _low_level_execute_command(): starting 49915 1727204311.87690: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204311.5419083-51185-12110514401545/ > /dev/null 2>&1 && sleep 0' 49915 1727204311.88953: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204311.88963: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204311.88972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204311.89178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204311.89186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204311.89189: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204311.89383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204311.89453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204311.91490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204311.91494: stdout chunk (state=3): >>><<< 49915 1727204311.91500: stderr chunk (state=3): >>><<< 49915 1727204311.91540: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204311.91544: handler run complete 49915 1727204311.91594: attempt loop complete, returning result 49915 1727204311.91598: _execute() done 49915 1727204311.91600: dumping result to json 49915 1727204311.91673: done dumping result, returning 49915 1727204311.91678: done running TaskExecutor() for managed-node2/TASK: Get stat for interface lsr101.90 [028d2410-947f-dcd7-b5af-00000000069c] 49915 1727204311.91680: sending task result for task 028d2410-947f-dcd7-b5af-00000000069c 49915 1727204311.91757: done sending task result for task 028d2410-947f-dcd7-b5af-00000000069c 49915 1727204311.91760: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204310.1922226, "block_size": 4096, "blocks": 0, "ctime": 1727204310.1922226, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30583, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": "../../devices/virtual/net/lsr101.90", "mode": "0777", "mtime": 1727204310.1922226, "nlink": 1, "path": "/sys/class/net/lsr101.90", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 49915 1727204311.91871: no more pending results, returning what we have 49915 1727204311.91878: results queue empty 49915 1727204311.91879: checking for any_errors_fatal 49915 1727204311.91880: done checking for any_errors_fatal 49915 1727204311.91881: checking for max_fail_percentage 49915 1727204311.91883: done checking for max_fail_percentage 49915 1727204311.91884: checking to see if all hosts have failed and the running result is not ok 49915 1727204311.91885: done checking to see if all hosts have failed 49915 1727204311.91886: getting the remaining hosts for this loop 49915 1727204311.91888: done getting the remaining hosts for this loop 49915 1727204311.91893: getting the next task for host managed-node2 49915 1727204311.91901: done getting next task for host managed-node2 49915 1727204311.91903: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 49915 1727204311.91906: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204311.91914: getting variables 49915 1727204311.91915: in VariableManager get_vars() 49915 1727204311.91956: Calling all_inventory to load vars for managed-node2 49915 1727204311.91958: Calling groups_inventory to load vars for managed-node2 49915 1727204311.91961: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204311.91971: Calling all_plugins_play to load vars for managed-node2 49915 1727204311.91973: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204311.91977: Calling groups_plugins_play to load vars for managed-node2 49915 1727204311.94198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204311.96162: done with get_vars() 49915 1727204311.96187: done getting variables 49915 1727204311.96286: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204311.96424: variable 'interface' from source: include params 49915 1727204311.96429: variable 'vlan_interface' from source: play vars 49915 1727204311.96500: variable 'vlan_interface' from source: play vars TASK [Assert that the interface is present - 'lsr101.90'] ********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:58:31 -0400 (0:00:00.493) 0:00:18.671 ***** 49915 1727204311.96535: entering _queue_task() for managed-node2/assert 49915 1727204311.97085: worker is 1 (out of 1 available) 49915 1727204311.97098: exiting _queue_task() for managed-node2/assert 49915 1727204311.97114: done queuing things up, now waiting for results queue to drain 49915 1727204311.97115: waiting for pending results... 
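The final trace in this section evaluates the device-presence assertion for lsr101.90. A hypothetical sketch of that task, built around the exact condition the log reports ('Evaluated conditional (interface_stat.stat.exists): True'):

    # Hypothetical sketch of the assertion traced below; only the condition is taken from the log.
    - name: Assert that the interface is present - '{{ interface }}'
      ansible.builtin.assert:
        that:
          - interface_stat.stat.exists

Because assert is an action plugin evaluated on the controller, the trace below contains no _low_level_execute_command() round-trip to managed-node2, only the conditional evaluation and the 'All assertions passed' result.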
49915 1727204311.97710: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'lsr101.90' 49915 1727204311.97844: in run() - task 028d2410-947f-dcd7-b5af-000000000579 49915 1727204311.97878: variable 'ansible_search_path' from source: unknown 49915 1727204311.97882: variable 'ansible_search_path' from source: unknown 49915 1727204311.97971: calling self._execute() 49915 1727204311.98025: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204311.98031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204311.98042: variable 'omit' from source: magic vars 49915 1727204311.98418: variable 'ansible_distribution_major_version' from source: facts 49915 1727204311.98435: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204311.98441: variable 'omit' from source: magic vars 49915 1727204311.98522: variable 'omit' from source: magic vars 49915 1727204311.98566: variable 'interface' from source: include params 49915 1727204311.98570: variable 'vlan_interface' from source: play vars 49915 1727204311.98630: variable 'vlan_interface' from source: play vars 49915 1727204311.98656: variable 'omit' from source: magic vars 49915 1727204311.98697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204311.98734: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204311.98782: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204311.98785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204311.98788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204311.98818: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204311.98822: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204311.98824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204311.98959: Set connection var ansible_connection to ssh 49915 1727204311.98962: Set connection var ansible_shell_type to sh 49915 1727204311.98965: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204311.98967: Set connection var ansible_shell_executable to /bin/sh 49915 1727204311.98970: Set connection var ansible_timeout to 10 49915 1727204311.98973: Set connection var ansible_pipelining to False 49915 1727204311.98976: variable 'ansible_shell_executable' from source: unknown 49915 1727204311.98979: variable 'ansible_connection' from source: unknown 49915 1727204311.98982: variable 'ansible_module_compression' from source: unknown 49915 1727204311.98985: variable 'ansible_shell_type' from source: unknown 49915 1727204311.98987: variable 'ansible_shell_executable' from source: unknown 49915 1727204311.98989: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204311.98991: variable 'ansible_pipelining' from source: unknown 49915 1727204311.98999: variable 'ansible_timeout' from source: unknown 49915 1727204311.99002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204311.99180: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204311.99184: variable 'omit' from source: magic vars 49915 1727204311.99186: starting attempt loop 49915 1727204311.99189: running the handler 49915 1727204311.99255: variable 'interface_stat' from source: set_fact 49915 1727204311.99273: Evaluated conditional (interface_stat.stat.exists): True 49915 1727204311.99278: handler run complete 49915 1727204311.99328: attempt loop complete, returning result 49915 1727204311.99331: _execute() done 49915 1727204311.99334: dumping result to json 49915 1727204311.99337: done dumping result, returning 49915 1727204311.99339: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'lsr101.90' [028d2410-947f-dcd7-b5af-000000000579] 49915 1727204311.99341: sending task result for task 028d2410-947f-dcd7-b5af-000000000579 49915 1727204311.99402: done sending task result for task 028d2410-947f-dcd7-b5af-000000000579 49915 1727204311.99404: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 49915 1727204311.99565: no more pending results, returning what we have 49915 1727204311.99568: results queue empty 49915 1727204311.99569: checking for any_errors_fatal 49915 1727204311.99576: done checking for any_errors_fatal 49915 1727204311.99577: checking for max_fail_percentage 49915 1727204311.99579: done checking for max_fail_percentage 49915 1727204311.99580: checking to see if all hosts have failed and the running result is not ok 49915 1727204311.99581: done checking to see if all hosts have failed 49915 1727204311.99581: getting the remaining hosts for this loop 49915 1727204311.99582: done getting the remaining hosts for this loop 49915 1727204311.99586: getting the next task for host managed-node2 49915 1727204311.99592: done getting next task for host managed-node2 49915 1727204311.99595: ^ task is: TASK: Include the task 'assert_profile_present.yml' 49915 1727204311.99597: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204311.99601: getting variables 49915 1727204311.99602: in VariableManager get_vars() 49915 1727204311.99640: Calling all_inventory to load vars for managed-node2 49915 1727204311.99643: Calling groups_inventory to load vars for managed-node2 49915 1727204311.99645: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204311.99655: Calling all_plugins_play to load vars for managed-node2 49915 1727204311.99657: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204311.99660: Calling groups_plugins_play to load vars for managed-node2 49915 1727204312.01414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204312.03644: done with get_vars() 49915 1727204312.03682: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:50 Tuesday 24 September 2024 14:58:32 -0400 (0:00:00.072) 0:00:18.743 ***** 49915 1727204312.03778: entering _queue_task() for managed-node2/include_tasks 49915 1727204312.04139: worker is 1 (out of 1 available) 49915 1727204312.04151: exiting _queue_task() for managed-node2/include_tasks 49915 1727204312.04164: done queuing things up, now waiting for results queue to drain 49915 1727204312.04165: waiting for pending results... 49915 1727204312.04594: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_present.yml' 49915 1727204312.04599: in run() - task 028d2410-947f-dcd7-b5af-00000000005c 49915 1727204312.04602: variable 'ansible_search_path' from source: unknown 49915 1727204312.04604: variable 'interface' from source: play vars 49915 1727204312.04803: variable 'interface' from source: play vars 49915 1727204312.04828: variable 'vlan_interface' from source: play vars 49915 1727204312.04897: variable 'vlan_interface' from source: play vars 49915 1727204312.04921: variable 'omit' from source: magic vars 49915 1727204312.05066: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204312.05082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204312.05098: variable 'omit' from source: magic vars 49915 1727204312.05399: variable 'ansible_distribution_major_version' from source: facts 49915 1727204312.05419: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204312.05454: variable 'item' from source: unknown 49915 1727204312.05810: variable 'item' from source: unknown 49915 1727204312.06499: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204312.06503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204312.06505: variable 'omit' from source: magic vars 49915 1727204312.06508: variable 'ansible_distribution_major_version' from source: facts 49915 1727204312.06510: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204312.06524: variable 'item' from source: unknown 49915 1727204312.06717: variable 'item' from source: unknown 49915 1727204312.06823: dumping result to json 49915 1727204312.06827: done dumping result, returning 49915 1727204312.06830: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_present.yml' [028d2410-947f-dcd7-b5af-00000000005c] 49915 1727204312.06832: sending task result for task 028d2410-947f-dcd7-b5af-00000000005c 49915 
1727204312.06972: done sending task result for task 028d2410-947f-dcd7-b5af-00000000005c 49915 1727204312.06977: WORKER PROCESS EXITING 49915 1727204312.07053: no more pending results, returning what we have 49915 1727204312.07058: in VariableManager get_vars() 49915 1727204312.07108: Calling all_inventory to load vars for managed-node2 49915 1727204312.07110: Calling groups_inventory to load vars for managed-node2 49915 1727204312.07112: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204312.07125: Calling all_plugins_play to load vars for managed-node2 49915 1727204312.07128: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204312.07130: Calling groups_plugins_play to load vars for managed-node2 49915 1727204312.10252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204312.12014: done with get_vars() 49915 1727204312.12037: variable 'ansible_search_path' from source: unknown 49915 1727204312.12053: variable 'ansible_search_path' from source: unknown 49915 1727204312.12065: we have included files to process 49915 1727204312.12067: generating all_blocks data 49915 1727204312.12068: done generating all_blocks data 49915 1727204312.12072: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 49915 1727204312.12073: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 49915 1727204312.12077: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 49915 1727204312.12461: in VariableManager get_vars() 49915 1727204312.12605: done with get_vars() 49915 1727204312.13081: done processing included file 49915 1727204312.13083: iterating over new_blocks loaded from include file 49915 1727204312.13084: in VariableManager get_vars() 49915 1727204312.13190: done with get_vars() 49915 1727204312.13192: filtering new block on tags 49915 1727204312.13265: done filtering new block on tags 49915 1727204312.13268: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 => (item=lsr101) 49915 1727204312.13279: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 49915 1727204312.13280: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 49915 1727204312.13284: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 49915 1727204312.13555: in VariableManager get_vars() 49915 1727204312.13592: done with get_vars() 49915 1727204312.13843: done processing included file 49915 1727204312.13845: iterating over new_blocks loaded from include file 49915 1727204312.13846: in VariableManager get_vars() 49915 1727204312.13867: done with get_vars() 49915 1727204312.13869: filtering new block on tags 49915 1727204312.13888: done filtering new block on tags 49915 1727204312.13890: done iterating over new_blocks loaded from include file included: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 => (item=lsr101.90) 49915 1727204312.13894: extending task lists for all hosts with included blocks 49915 1727204312.17057: done extending task lists 49915 1727204312.17060: done processing included files 49915 1727204312.17060: results queue empty 49915 1727204312.17061: checking for any_errors_fatal 49915 1727204312.17065: done checking for any_errors_fatal 49915 1727204312.17066: checking for max_fail_percentage 49915 1727204312.17067: done checking for max_fail_percentage 49915 1727204312.17068: checking to see if all hosts have failed and the running result is not ok 49915 1727204312.17069: done checking to see if all hosts have failed 49915 1727204312.17069: getting the remaining hosts for this loop 49915 1727204312.17070: done getting the remaining hosts for this loop 49915 1727204312.17073: getting the next task for host managed-node2 49915 1727204312.17080: done getting next task for host managed-node2 49915 1727204312.17082: ^ task is: TASK: Include the task 'get_profile_stat.yml' 49915 1727204312.17085: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204312.17087: getting variables 49915 1727204312.17088: in VariableManager get_vars() 49915 1727204312.17112: Calling all_inventory to load vars for managed-node2 49915 1727204312.17115: Calling groups_inventory to load vars for managed-node2 49915 1727204312.17118: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204312.17124: Calling all_plugins_play to load vars for managed-node2 49915 1727204312.17127: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204312.17136: Calling groups_plugins_play to load vars for managed-node2 49915 1727204312.18483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204312.20850: done with get_vars() 49915 1727204312.20885: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:58:32 -0400 (0:00:00.172) 0:00:18.916 ***** 49915 1727204312.21061: entering _queue_task() for managed-node2/include_tasks 49915 1727204312.21710: worker is 1 (out of 1 available) 49915 1727204312.21721: exiting _queue_task() for managed-node2/include_tasks 49915 1727204312.21734: done queuing things up, now waiting for results queue to drain 49915 1727204312.21736: waiting for pending results... 
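
The looping include that produced the two "=> (item=lsr101)" / "=> (item=lsr101.90)" results above, and the nested include of get_profile_stat.yml queued next, might look roughly like this. The loop values and the 'profile' include parameter are inferred from the variables the log resolves (interface, vlan_interface, item, profile "from source: include params"); the exact wording of tests_vlan_mtu.yml:50 and assert_profile_present.yml:3 is an assumption.

# Sketch of tests_vlan_mtu.yml:50
- name: Include the task 'assert_profile_present.yml'
  ansible.builtin.include_tasks: tasks/assert_profile_present.yml
  vars:
    profile: "{{ item }}"
  loop:
    - "{{ interface }}"        # lsr101
    - "{{ vlan_interface }}"   # lsr101.90

# Sketch of assert_profile_present.yml:3
- name: Include the task 'get_profile_stat.yml'
  ansible.builtin.include_tasks: get_profile_stat.yml
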
49915 1727204312.22224: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 49915 1727204312.22369: in run() - task 028d2410-947f-dcd7-b5af-0000000006b8 49915 1727204312.22391: variable 'ansible_search_path' from source: unknown 49915 1727204312.22431: variable 'ansible_search_path' from source: unknown 49915 1727204312.22458: calling self._execute() 49915 1727204312.22560: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204312.22617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204312.22620: variable 'omit' from source: magic vars 49915 1727204312.23097: variable 'ansible_distribution_major_version' from source: facts 49915 1727204312.23101: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204312.23105: _execute() done 49915 1727204312.23108: dumping result to json 49915 1727204312.23111: done dumping result, returning 49915 1727204312.23115: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [028d2410-947f-dcd7-b5af-0000000006b8] 49915 1727204312.23117: sending task result for task 028d2410-947f-dcd7-b5af-0000000006b8 49915 1727204312.23211: done sending task result for task 028d2410-947f-dcd7-b5af-0000000006b8 49915 1727204312.23243: no more pending results, returning what we have 49915 1727204312.23250: in VariableManager get_vars() 49915 1727204312.23407: Calling all_inventory to load vars for managed-node2 49915 1727204312.23410: Calling groups_inventory to load vars for managed-node2 49915 1727204312.23412: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204312.23488: Calling all_plugins_play to load vars for managed-node2 49915 1727204312.23492: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204312.23497: Calling groups_plugins_play to load vars for managed-node2 49915 1727204312.24199: WORKER PROCESS EXITING 49915 1727204312.25169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204312.26061: done with get_vars() 49915 1727204312.26078: variable 'ansible_search_path' from source: unknown 49915 1727204312.26079: variable 'ansible_search_path' from source: unknown 49915 1727204312.26108: we have included files to process 49915 1727204312.26109: generating all_blocks data 49915 1727204312.26110: done generating all_blocks data 49915 1727204312.26113: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 49915 1727204312.26114: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 49915 1727204312.26116: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 49915 1727204312.26914: done processing included file 49915 1727204312.26916: iterating over new_blocks loaded from include file 49915 1727204312.26917: in VariableManager get_vars() 49915 1727204312.26942: done with get_vars() 49915 1727204312.26944: filtering new block on tags 49915 1727204312.26968: done filtering new block on tags 49915 1727204312.26971: in VariableManager get_vars() 49915 1727204312.26991: done with get_vars() 49915 1727204312.26993: filtering new block on tags 49915 1727204312.27015: done filtering new block on tags 49915 1727204312.27017: done iterating over 
new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 49915 1727204312.27022: extending task lists for all hosts with included blocks 49915 1727204312.27178: done extending task lists 49915 1727204312.27180: done processing included files 49915 1727204312.27181: results queue empty 49915 1727204312.27181: checking for any_errors_fatal 49915 1727204312.27185: done checking for any_errors_fatal 49915 1727204312.27185: checking for max_fail_percentage 49915 1727204312.27186: done checking for max_fail_percentage 49915 1727204312.27187: checking to see if all hosts have failed and the running result is not ok 49915 1727204312.27188: done checking to see if all hosts have failed 49915 1727204312.27189: getting the remaining hosts for this loop 49915 1727204312.27190: done getting the remaining hosts for this loop 49915 1727204312.27192: getting the next task for host managed-node2 49915 1727204312.27196: done getting next task for host managed-node2 49915 1727204312.27198: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 49915 1727204312.27201: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204312.27203: getting variables 49915 1727204312.27204: in VariableManager get_vars() 49915 1727204312.27278: Calling all_inventory to load vars for managed-node2 49915 1727204312.27284: Calling groups_inventory to load vars for managed-node2 49915 1727204312.27286: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204312.27292: Calling all_plugins_play to load vars for managed-node2 49915 1727204312.27294: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204312.27297: Calling groups_plugins_play to load vars for managed-node2 49915 1727204312.28299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204312.29408: done with get_vars() 49915 1727204312.29430: done getting variables 49915 1727204312.29460: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:58:32 -0400 (0:00:00.084) 0:00:19.001 ***** 49915 1727204312.29487: entering _queue_task() for managed-node2/set_fact 49915 1727204312.29752: worker is 1 (out of 1 available) 49915 1727204312.29766: exiting _queue_task() for managed-node2/set_fact 49915 1727204312.29779: done queuing things up, now waiting for results queue to drain 49915 1727204312.29781: waiting for pending results... 
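
The set_fact task whose header appears just above is simple enough to reconstruct from the ansible_facts it reports a little later in the log (all three flags start out false); treat the exact layout as an assumption rather than the literal file contents.

- name: Initialize NM profile exist and ansible_managed comment flag
  ansible.builtin.set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
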
49915 1727204312.29952: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 49915 1727204312.30035: in run() - task 028d2410-947f-dcd7-b5af-0000000007f0 49915 1727204312.30047: variable 'ansible_search_path' from source: unknown 49915 1727204312.30050: variable 'ansible_search_path' from source: unknown 49915 1727204312.30080: calling self._execute() 49915 1727204312.30150: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204312.30154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204312.30163: variable 'omit' from source: magic vars 49915 1727204312.30456: variable 'ansible_distribution_major_version' from source: facts 49915 1727204312.30467: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204312.30471: variable 'omit' from source: magic vars 49915 1727204312.30507: variable 'omit' from source: magic vars 49915 1727204312.30533: variable 'omit' from source: magic vars 49915 1727204312.30570: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204312.30601: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204312.30619: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204312.30633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204312.30643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204312.30672: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204312.30677: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204312.30680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204312.30745: Set connection var ansible_connection to ssh 49915 1727204312.30748: Set connection var ansible_shell_type to sh 49915 1727204312.30753: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204312.30764: Set connection var ansible_shell_executable to /bin/sh 49915 1727204312.30767: Set connection var ansible_timeout to 10 49915 1727204312.30774: Set connection var ansible_pipelining to False 49915 1727204312.30795: variable 'ansible_shell_executable' from source: unknown 49915 1727204312.30798: variable 'ansible_connection' from source: unknown 49915 1727204312.30801: variable 'ansible_module_compression' from source: unknown 49915 1727204312.30803: variable 'ansible_shell_type' from source: unknown 49915 1727204312.30806: variable 'ansible_shell_executable' from source: unknown 49915 1727204312.30808: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204312.30810: variable 'ansible_pipelining' from source: unknown 49915 1727204312.30815: variable 'ansible_timeout' from source: unknown 49915 1727204312.30817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204312.30923: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204312.30932: variable 
'omit' from source: magic vars 49915 1727204312.30937: starting attempt loop 49915 1727204312.30940: running the handler 49915 1727204312.30951: handler run complete 49915 1727204312.30960: attempt loop complete, returning result 49915 1727204312.30963: _execute() done 49915 1727204312.30965: dumping result to json 49915 1727204312.30968: done dumping result, returning 49915 1727204312.30974: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [028d2410-947f-dcd7-b5af-0000000007f0] 49915 1727204312.30982: sending task result for task 028d2410-947f-dcd7-b5af-0000000007f0 49915 1727204312.31061: done sending task result for task 028d2410-947f-dcd7-b5af-0000000007f0 49915 1727204312.31065: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 49915 1727204312.31139: no more pending results, returning what we have 49915 1727204312.31142: results queue empty 49915 1727204312.31143: checking for any_errors_fatal 49915 1727204312.31144: done checking for any_errors_fatal 49915 1727204312.31145: checking for max_fail_percentage 49915 1727204312.31147: done checking for max_fail_percentage 49915 1727204312.31147: checking to see if all hosts have failed and the running result is not ok 49915 1727204312.31148: done checking to see if all hosts have failed 49915 1727204312.31149: getting the remaining hosts for this loop 49915 1727204312.31150: done getting the remaining hosts for this loop 49915 1727204312.31154: getting the next task for host managed-node2 49915 1727204312.31162: done getting next task for host managed-node2 49915 1727204312.31164: ^ task is: TASK: Stat profile file 49915 1727204312.31168: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204312.31172: getting variables 49915 1727204312.31176: in VariableManager get_vars() 49915 1727204312.31220: Calling all_inventory to load vars for managed-node2 49915 1727204312.31223: Calling groups_inventory to load vars for managed-node2 49915 1727204312.31225: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204312.31234: Calling all_plugins_play to load vars for managed-node2 49915 1727204312.31236: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204312.31238: Calling groups_plugins_play to load vars for managed-node2 49915 1727204312.32466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204312.33487: done with get_vars() 49915 1727204312.33510: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:58:32 -0400 (0:00:00.040) 0:00:19.042 ***** 49915 1727204312.33583: entering _queue_task() for managed-node2/stat 49915 1727204312.33842: worker is 1 (out of 1 available) 49915 1727204312.33855: exiting _queue_task() for managed-node2/stat 49915 1727204312.33868: done queuing things up, now waiting for results queue to drain 49915 1727204312.33869: waiting for pending results... 49915 1727204312.34051: running TaskExecutor() for managed-node2/TASK: Stat profile file 49915 1727204312.34134: in run() - task 028d2410-947f-dcd7-b5af-0000000007f1 49915 1727204312.34146: variable 'ansible_search_path' from source: unknown 49915 1727204312.34150: variable 'ansible_search_path' from source: unknown 49915 1727204312.34180: calling self._execute() 49915 1727204312.34252: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204312.34256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204312.34265: variable 'omit' from source: magic vars 49915 1727204312.34547: variable 'ansible_distribution_major_version' from source: facts 49915 1727204312.34558: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204312.34563: variable 'omit' from source: magic vars 49915 1727204312.34594: variable 'omit' from source: magic vars 49915 1727204312.34668: variable 'profile' from source: include params 49915 1727204312.34672: variable 'item' from source: include params 49915 1727204312.34724: variable 'item' from source: include params 49915 1727204312.34739: variable 'omit' from source: magic vars 49915 1727204312.34777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204312.34804: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204312.34822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204312.34836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204312.34846: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204312.34874: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204312.34879: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204312.34882: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204312.34946: Set connection var ansible_connection to ssh 49915 1727204312.34950: Set connection var ansible_shell_type to sh 49915 1727204312.34955: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204312.34965: Set connection var ansible_shell_executable to /bin/sh 49915 1727204312.34969: Set connection var ansible_timeout to 10 49915 1727204312.34980: Set connection var ansible_pipelining to False 49915 1727204312.34996: variable 'ansible_shell_executable' from source: unknown 49915 1727204312.34999: variable 'ansible_connection' from source: unknown 49915 1727204312.35001: variable 'ansible_module_compression' from source: unknown 49915 1727204312.35004: variable 'ansible_shell_type' from source: unknown 49915 1727204312.35006: variable 'ansible_shell_executable' from source: unknown 49915 1727204312.35008: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204312.35013: variable 'ansible_pipelining' from source: unknown 49915 1727204312.35018: variable 'ansible_timeout' from source: unknown 49915 1727204312.35022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204312.35171: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49915 1727204312.35182: variable 'omit' from source: magic vars 49915 1727204312.35192: starting attempt loop 49915 1727204312.35196: running the handler 49915 1727204312.35201: _low_level_execute_command(): starting 49915 1727204312.35209: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204312.35736: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204312.35741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49915 1727204312.35745: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204312.35785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204312.35793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204312.35890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204312.37600: stdout chunk (state=3): >>>/root <<< 49915 1727204312.37699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204312.37731: stderr chunk (state=3): >>><<< 49915 1727204312.37735: stdout chunk 
(state=3): >>><<< 49915 1727204312.37755: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204312.37772: _low_level_execute_command(): starting 49915 1727204312.37782: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204312.3775568-51230-14323561500152 `" && echo ansible-tmp-1727204312.3775568-51230-14323561500152="` echo /root/.ansible/tmp/ansible-tmp-1727204312.3775568-51230-14323561500152 `" ) && sleep 0' 49915 1727204312.38248: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204312.38252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204312.38254: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204312.38264: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204312.38266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204312.38268: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204312.38314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204312.38318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204312.38324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204312.38397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204312.40316: stdout chunk (state=3): 
>>>ansible-tmp-1727204312.3775568-51230-14323561500152=/root/.ansible/tmp/ansible-tmp-1727204312.3775568-51230-14323561500152 <<< 49915 1727204312.40417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204312.40445: stderr chunk (state=3): >>><<< 49915 1727204312.40448: stdout chunk (state=3): >>><<< 49915 1727204312.40465: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204312.3775568-51230-14323561500152=/root/.ansible/tmp/ansible-tmp-1727204312.3775568-51230-14323561500152 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204312.40511: variable 'ansible_module_compression' from source: unknown 49915 1727204312.40558: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 49915 1727204312.40592: variable 'ansible_facts' from source: unknown 49915 1727204312.40655: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204312.3775568-51230-14323561500152/AnsiballZ_stat.py 49915 1727204312.40765: Sending initial data 49915 1727204312.40768: Sent initial data (152 bytes) 49915 1727204312.41247: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204312.41251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204312.41253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204312.41256: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 49915 1727204312.41259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204312.41307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7e62c1f305' <<< 49915 1727204312.41310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204312.41316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204312.41389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204312.42938: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204312.43006: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49915 1727204312.43086: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpw7efgg7z /root/.ansible/tmp/ansible-tmp-1727204312.3775568-51230-14323561500152/AnsiballZ_stat.py <<< 49915 1727204312.43089: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204312.3775568-51230-14323561500152/AnsiballZ_stat.py" <<< 49915 1727204312.43146: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpw7efgg7z" to remote "/root/.ansible/tmp/ansible-tmp-1727204312.3775568-51230-14323561500152/AnsiballZ_stat.py" <<< 49915 1727204312.43151: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204312.3775568-51230-14323561500152/AnsiballZ_stat.py" <<< 49915 1727204312.43783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204312.43828: stderr chunk (state=3): >>><<< 49915 1727204312.43831: stdout chunk (state=3): >>><<< 49915 1727204312.43875: done transferring module to remote 49915 1727204312.43885: _low_level_execute_command(): starting 49915 1727204312.43889: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204312.3775568-51230-14323561500152/ /root/.ansible/tmp/ansible-tmp-1727204312.3775568-51230-14323561500152/AnsiballZ_stat.py && sleep 0' 49915 1727204312.44346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204312.44349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204312.44351: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 49915 1727204312.44357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204312.44359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204312.44416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204312.44419: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204312.44486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204312.46257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204312.46284: stderr chunk (state=3): >>><<< 49915 1727204312.46287: stdout chunk (state=3): >>><<< 49915 1727204312.46301: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204312.46305: _low_level_execute_command(): starting 49915 1727204312.46314: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204312.3775568-51230-14323561500152/AnsiballZ_stat.py && sleep 0' 49915 1727204312.46768: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204312.46773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204312.46778: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204312.46780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204312.46832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204312.46835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204312.46838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204312.46920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204312.62787: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101", "follow": false, "checksum_algorithm": "sha1"}}} <<< 49915 1727204312.64104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 49915 1727204312.64108: stdout chunk (state=3): >>><<< 49915 1727204312.64111: stderr chunk (state=3): >>><<< 49915 1727204312.64416: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
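
The module invocation echoed in the stdout chunk above (stat on /etc/sysconfig/network-scripts/ifcfg-lsr101 with get_attributes, get_checksum and get_mime all disabled) suggests a "Stat profile file" task along these lines; the path template and the register name are assumptions based on the surrounding task names.

- name: Stat profile file
  ansible.builtin.stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat   # hypothetical name; consumed by the flag-setting task that follows
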
49915 1727204312.64421: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204312.3775568-51230-14323561500152/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204312.64424: _low_level_execute_command(): starting 49915 1727204312.64426: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204312.3775568-51230-14323561500152/ > /dev/null 2>&1 && sleep 0' 49915 1727204312.65309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204312.65424: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204312.65531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204312.65536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204312.65538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204312.65671: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204312.65701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204312.65727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204312.65760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204312.66106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204312.68023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204312.68027: stdout chunk (state=3): >>><<< 49915 1727204312.68037: stderr chunk (state=3): >>><<< 49915 1727204312.68223: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204312.68227: handler run complete 49915 1727204312.68230: attempt loop complete, returning result 49915 1727204312.68234: _execute() done 49915 1727204312.68236: dumping result to json 49915 1727204312.68239: done dumping result, returning 49915 1727204312.68241: done running TaskExecutor() for managed-node2/TASK: Stat profile file [028d2410-947f-dcd7-b5af-0000000007f1] 49915 1727204312.68243: sending task result for task 028d2410-947f-dcd7-b5af-0000000007f1 ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 49915 1727204312.68730: no more pending results, returning what we have 49915 1727204312.68734: results queue empty 49915 1727204312.68735: checking for any_errors_fatal 49915 1727204312.68742: done checking for any_errors_fatal 49915 1727204312.68743: checking for max_fail_percentage 49915 1727204312.68745: done checking for max_fail_percentage 49915 1727204312.68746: checking to see if all hosts have failed and the running result is not ok 49915 1727204312.68748: done checking to see if all hosts have failed 49915 1727204312.68748: getting the remaining hosts for this loop 49915 1727204312.68750: done getting the remaining hosts for this loop 49915 1727204312.68754: getting the next task for host managed-node2 49915 1727204312.68763: done getting next task for host managed-node2 49915 1727204312.68766: ^ task is: TASK: Set NM profile exist flag based on the profile files 49915 1727204312.68771: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204312.68777: getting variables 49915 1727204312.68779: in VariableManager get_vars() 49915 1727204312.68831: Calling all_inventory to load vars for managed-node2 49915 1727204312.68834: Calling groups_inventory to load vars for managed-node2 49915 1727204312.68837: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204312.68849: Calling all_plugins_play to load vars for managed-node2 49915 1727204312.68852: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204312.68856: Calling groups_plugins_play to load vars for managed-node2 49915 1727204312.69774: done sending task result for task 028d2410-947f-dcd7-b5af-0000000007f1 49915 1727204312.70346: WORKER PROCESS EXITING 49915 1727204312.71741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204312.74542: done with get_vars() 49915 1727204312.74680: done getting variables 49915 1727204312.74751: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:58:32 -0400 (0:00:00.411) 0:00:19.454 ***** 49915 1727204312.74785: entering _queue_task() for managed-node2/set_fact 49915 1727204312.75580: worker is 1 (out of 1 available) 49915 1727204312.75706: exiting _queue_task() for managed-node2/set_fact 49915 1727204312.75722: done queuing things up, now waiting for results queue to drain 49915 1727204312.75723: waiting for pending results... 
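The "Stat profile file" result just above (stat.exists == false, registered and later looked up as profile_stat) is what the following conditionals key off. A minimal sketch of what that task in get_profile_stat.yml likely looks like, reconstructed from the module_args printed in this log; the templated path and the register name are assumptions inferred from the rendered path ifcfg-lsr101 and the later 'profile_stat' variable lookups:

- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # rendered above as ifcfg-lsr101
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat  # later evaluated as profile_stat.stat.exists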
49915 1727204312.76056: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 49915 1727204312.76168: in run() - task 028d2410-947f-dcd7-b5af-0000000007f2 49915 1727204312.76473: variable 'ansible_search_path' from source: unknown 49915 1727204312.76484: variable 'ansible_search_path' from source: unknown 49915 1727204312.76625: calling self._execute() 49915 1727204312.76847: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204312.76859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204312.76874: variable 'omit' from source: magic vars 49915 1727204312.77886: variable 'ansible_distribution_major_version' from source: facts 49915 1727204312.77890: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204312.78051: variable 'profile_stat' from source: set_fact 49915 1727204312.78071: Evaluated conditional (profile_stat.stat.exists): False 49915 1727204312.78081: when evaluation is False, skipping this task 49915 1727204312.78089: _execute() done 49915 1727204312.78105: dumping result to json 49915 1727204312.78210: done dumping result, returning 49915 1727204312.78217: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [028d2410-947f-dcd7-b5af-0000000007f2] 49915 1727204312.78220: sending task result for task 028d2410-947f-dcd7-b5af-0000000007f2 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49915 1727204312.78364: no more pending results, returning what we have 49915 1727204312.78368: results queue empty 49915 1727204312.78369: checking for any_errors_fatal 49915 1727204312.78379: done checking for any_errors_fatal 49915 1727204312.78379: checking for max_fail_percentage 49915 1727204312.78382: done checking for max_fail_percentage 49915 1727204312.78383: checking to see if all hosts have failed and the running result is not ok 49915 1727204312.78384: done checking to see if all hosts have failed 49915 1727204312.78385: getting the remaining hosts for this loop 49915 1727204312.78387: done getting the remaining hosts for this loop 49915 1727204312.78391: getting the next task for host managed-node2 49915 1727204312.78398: done getting next task for host managed-node2 49915 1727204312.78400: ^ task is: TASK: Get NM profile info 49915 1727204312.78405: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204312.78409: getting variables 49915 1727204312.78411: in VariableManager get_vars() 49915 1727204312.78461: Calling all_inventory to load vars for managed-node2 49915 1727204312.78464: Calling groups_inventory to load vars for managed-node2 49915 1727204312.78467: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204312.78589: Calling all_plugins_play to load vars for managed-node2 49915 1727204312.78594: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204312.78598: Calling groups_plugins_play to load vars for managed-node2 49915 1727204312.79466: done sending task result for task 028d2410-947f-dcd7-b5af-0000000007f2 49915 1727204312.79469: WORKER PROCESS EXITING 49915 1727204312.81682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204312.83639: done with get_vars() 49915 1727204312.83681: done getting variables 49915 1727204312.83792: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:58:32 -0400 (0:00:00.090) 0:00:19.544 ***** 49915 1727204312.83829: entering _queue_task() for managed-node2/shell 49915 1727204312.83831: Creating lock for shell 49915 1727204312.84698: worker is 1 (out of 1 available) 49915 1727204312.84829: exiting _queue_task() for managed-node2/shell 49915 1727204312.84841: done queuing things up, now waiting for results queue to drain 49915 1727204312.84842: waiting for pending results... 
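For context, the task being queued here ("Get NM profile info", get_profile_stat.yml:25) likely looks roughly like the sketch below, reconstructed from the rendered command and the registered variable that appear further down in this log; the templated grep pattern and the changed_when override are assumptions:

- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc  # rendered below with lsr101
  register: nm_profile_exists   # later checked as nm_profile_exists.rc == 0
  changed_when: false           # assumed, since the final result reports "changed": false although the command ran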
49915 1727204312.85096: running TaskExecutor() for managed-node2/TASK: Get NM profile info 49915 1727204312.85249: in run() - task 028d2410-947f-dcd7-b5af-0000000007f3 49915 1727204312.85265: variable 'ansible_search_path' from source: unknown 49915 1727204312.85269: variable 'ansible_search_path' from source: unknown 49915 1727204312.85315: calling self._execute() 49915 1727204312.85402: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204312.85406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204312.85423: variable 'omit' from source: magic vars 49915 1727204312.85847: variable 'ansible_distribution_major_version' from source: facts 49915 1727204312.85852: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204312.85855: variable 'omit' from source: magic vars 49915 1727204312.85859: variable 'omit' from source: magic vars 49915 1727204312.85965: variable 'profile' from source: include params 49915 1727204312.85969: variable 'item' from source: include params 49915 1727204312.86065: variable 'item' from source: include params 49915 1727204312.86068: variable 'omit' from source: magic vars 49915 1727204312.86095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204312.86130: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204312.86173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204312.86178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204312.86181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204312.86282: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204312.86286: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204312.86288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204312.86316: Set connection var ansible_connection to ssh 49915 1727204312.86319: Set connection var ansible_shell_type to sh 49915 1727204312.86326: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204312.86390: Set connection var ansible_shell_executable to /bin/sh 49915 1727204312.86394: Set connection var ansible_timeout to 10 49915 1727204312.86396: Set connection var ansible_pipelining to False 49915 1727204312.86399: variable 'ansible_shell_executable' from source: unknown 49915 1727204312.86401: variable 'ansible_connection' from source: unknown 49915 1727204312.86403: variable 'ansible_module_compression' from source: unknown 49915 1727204312.86405: variable 'ansible_shell_type' from source: unknown 49915 1727204312.86407: variable 'ansible_shell_executable' from source: unknown 49915 1727204312.86409: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204312.86414: variable 'ansible_pipelining' from source: unknown 49915 1727204312.86417: variable 'ansible_timeout' from source: unknown 49915 1727204312.86419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204312.86645: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204312.86649: variable 'omit' from source: magic vars 49915 1727204312.86651: starting attempt loop 49915 1727204312.86653: running the handler 49915 1727204312.86657: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204312.86669: _low_level_execute_command(): starting 49915 1727204312.86672: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204312.87494: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204312.87543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204312.87557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204312.87656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204312.89354: stdout chunk (state=3): >>>/root <<< 49915 1727204312.89507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204312.89593: stderr chunk (state=3): >>><<< 49915 1727204312.89959: stdout chunk (state=3): >>><<< 49915 1727204312.89963: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204312.89966: _low_level_execute_command(): starting 49915 1727204312.89969: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204312.897561-51249-86990876141331 `" && echo ansible-tmp-1727204312.897561-51249-86990876141331="` echo /root/.ansible/tmp/ansible-tmp-1727204312.897561-51249-86990876141331 `" ) && sleep 0' 49915 1727204312.91182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204312.91278: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204312.91396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204312.91431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204312.91460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204312.91613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204312.93564: stdout chunk (state=3): >>>ansible-tmp-1727204312.897561-51249-86990876141331=/root/.ansible/tmp/ansible-tmp-1727204312.897561-51249-86990876141331 <<< 49915 1727204312.93752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204312.93762: stdout chunk (state=3): >>><<< 49915 1727204312.93789: stderr chunk (state=3): >>><<< 49915 1727204312.93816: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204312.897561-51249-86990876141331=/root/.ansible/tmp/ansible-tmp-1727204312.897561-51249-86990876141331 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204312.93858: variable 'ansible_module_compression' from source: unknown 49915 1727204312.93934: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49915 1727204312.93982: variable 'ansible_facts' from source: unknown 49915 1727204312.94247: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204312.897561-51249-86990876141331/AnsiballZ_command.py 49915 1727204312.94418: Sending initial data 49915 1727204312.94421: Sent initial data (154 bytes) 49915 1727204312.95498: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204312.95532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204312.95630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204312.95633: stderr chunk (state=3): >>>debug2: match not found <<< 49915 1727204312.95646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204312.95752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204312.95837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204312.95874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204312.95955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204312.97545: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204312.97611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49915 1727204312.97707: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp0zyxlqfk /root/.ansible/tmp/ansible-tmp-1727204312.897561-51249-86990876141331/AnsiballZ_command.py <<< 49915 1727204312.97710: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204312.897561-51249-86990876141331/AnsiballZ_command.py" <<< 49915 1727204312.97765: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp0zyxlqfk" to remote "/root/.ansible/tmp/ansible-tmp-1727204312.897561-51249-86990876141331/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204312.897561-51249-86990876141331/AnsiballZ_command.py" <<< 49915 1727204312.98749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204312.98752: stdout chunk (state=3): >>><<< 49915 1727204312.98755: stderr chunk (state=3): >>><<< 49915 1727204312.98757: done transferring module to remote 49915 1727204312.98759: _low_level_execute_command(): starting 49915 1727204312.98761: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204312.897561-51249-86990876141331/ /root/.ansible/tmp/ansible-tmp-1727204312.897561-51249-86990876141331/AnsiballZ_command.py && sleep 0' 49915 1727204312.99390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204312.99393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204312.99461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204312.99474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204312.99498: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204312.99601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204313.01479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204313.01483: stdout chunk (state=3): >>><<< 49915 1727204313.01485: stderr chunk (state=3): >>><<< 49915 1727204313.01685: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204313.01689: _low_level_execute_command(): starting 49915 1727204313.01692: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204312.897561-51249-86990876141331/AnsiballZ_command.py && sleep 0' 49915 1727204313.02475: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204313.02555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204313.02640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204313.19714: stdout chunk (state=3): >>> {"changed": true, "stdout": "lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection \nlsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "start": "2024-09-24 14:58:33.176028", "end": "2024-09-24 14:58:33.195673", "delta": "0:00:00.019645", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49915 1727204313.21242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 49915 1727204313.21268: stderr chunk (state=3): >>><<< 49915 1727204313.21272: stdout chunk (state=3): >>><<< 49915 1727204313.21296: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection \nlsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "start": "2024-09-24 14:58:33.176028", "end": "2024-09-24 14:58:33.195673", "delta": "0:00:00.019645", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
49915 1727204313.21326: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204312.897561-51249-86990876141331/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204313.21333: _low_level_execute_command(): starting 49915 1727204313.21338: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204312.897561-51249-86990876141331/ > /dev/null 2>&1 && sleep 0' 49915 1727204313.21773: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204313.21779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204313.21792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204313.21856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204313.21859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204313.21929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204313.23761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204313.23790: stderr chunk (state=3): >>><<< 49915 1727204313.23793: stdout chunk (state=3): >>><<< 49915 1727204313.23804: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204313.23811: handler run complete 49915 1727204313.23829: Evaluated conditional (False): False 49915 1727204313.23837: attempt loop complete, returning result 49915 1727204313.23840: _execute() done 49915 1727204313.23842: dumping result to json 49915 1727204313.23847: done dumping result, returning 49915 1727204313.23856: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [028d2410-947f-dcd7-b5af-0000000007f3] 49915 1727204313.23861: sending task result for task 028d2410-947f-dcd7-b5af-0000000007f3 49915 1727204313.23961: done sending task result for task 028d2410-947f-dcd7-b5af-0000000007f3 49915 1727204313.23964: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "delta": "0:00:00.019645", "end": "2024-09-24 14:58:33.195673", "rc": 0, "start": "2024-09-24 14:58:33.176028" } STDOUT: lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection 49915 1727204313.24032: no more pending results, returning what we have 49915 1727204313.24035: results queue empty 49915 1727204313.24036: checking for any_errors_fatal 49915 1727204313.24041: done checking for any_errors_fatal 49915 1727204313.24042: checking for max_fail_percentage 49915 1727204313.24044: done checking for max_fail_percentage 49915 1727204313.24044: checking to see if all hosts have failed and the running result is not ok 49915 1727204313.24045: done checking to see if all hosts have failed 49915 1727204313.24046: getting the remaining hosts for this loop 49915 1727204313.24047: done getting the remaining hosts for this loop 49915 1727204313.24051: getting the next task for host managed-node2 49915 1727204313.24058: done getting next task for host managed-node2 49915 1727204313.24060: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 49915 1727204313.24065: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204313.24068: getting variables 49915 1727204313.24069: in VariableManager get_vars() 49915 1727204313.24114: Calling all_inventory to load vars for managed-node2 49915 1727204313.24117: Calling groups_inventory to load vars for managed-node2 49915 1727204313.24120: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204313.24130: Calling all_plugins_play to load vars for managed-node2 49915 1727204313.24132: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204313.24134: Calling groups_plugins_play to load vars for managed-node2 49915 1727204313.25037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204313.25896: done with get_vars() 49915 1727204313.25912: done getting variables 49915 1727204313.25957: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:58:33 -0400 (0:00:00.421) 0:00:19.966 ***** 49915 1727204313.25980: entering _queue_task() for managed-node2/set_fact 49915 1727204313.26207: worker is 1 (out of 1 available) 49915 1727204313.26220: exiting _queue_task() for managed-node2/set_fact 49915 1727204313.26231: done queuing things up, now waiting for results queue to drain 49915 1727204313.26232: waiting for pending results... 
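The set_fact task queued here (get_profile_stat.yml:35) can be reconstructed almost completely from the conditional evaluation and the facts reported in the result below; a sketch, with the exact layout of the source file assumed:

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0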
49915 1727204313.26404: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 49915 1727204313.26474: in run() - task 028d2410-947f-dcd7-b5af-0000000007f4 49915 1727204313.26487: variable 'ansible_search_path' from source: unknown 49915 1727204313.26492: variable 'ansible_search_path' from source: unknown 49915 1727204313.26523: calling self._execute() 49915 1727204313.26595: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.26599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.26608: variable 'omit' from source: magic vars 49915 1727204313.26895: variable 'ansible_distribution_major_version' from source: facts 49915 1727204313.26908: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204313.26994: variable 'nm_profile_exists' from source: set_fact 49915 1727204313.27009: Evaluated conditional (nm_profile_exists.rc == 0): True 49915 1727204313.27012: variable 'omit' from source: magic vars 49915 1727204313.27048: variable 'omit' from source: magic vars 49915 1727204313.27069: variable 'omit' from source: magic vars 49915 1727204313.27103: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204313.27135: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204313.27150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204313.27164: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204313.27174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204313.27199: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204313.27202: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.27205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.27277: Set connection var ansible_connection to ssh 49915 1727204313.27280: Set connection var ansible_shell_type to sh 49915 1727204313.27285: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204313.27293: Set connection var ansible_shell_executable to /bin/sh 49915 1727204313.27298: Set connection var ansible_timeout to 10 49915 1727204313.27304: Set connection var ansible_pipelining to False 49915 1727204313.27325: variable 'ansible_shell_executable' from source: unknown 49915 1727204313.27328: variable 'ansible_connection' from source: unknown 49915 1727204313.27330: variable 'ansible_module_compression' from source: unknown 49915 1727204313.27332: variable 'ansible_shell_type' from source: unknown 49915 1727204313.27334: variable 'ansible_shell_executable' from source: unknown 49915 1727204313.27336: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.27340: variable 'ansible_pipelining' from source: unknown 49915 1727204313.27343: variable 'ansible_timeout' from source: unknown 49915 1727204313.27345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.27448: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204313.27468: variable 'omit' from source: magic vars 49915 1727204313.27471: starting attempt loop 49915 1727204313.27474: running the handler 49915 1727204313.27478: handler run complete 49915 1727204313.27487: attempt loop complete, returning result 49915 1727204313.27489: _execute() done 49915 1727204313.27491: dumping result to json 49915 1727204313.27494: done dumping result, returning 49915 1727204313.27501: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [028d2410-947f-dcd7-b5af-0000000007f4] 49915 1727204313.27506: sending task result for task 028d2410-947f-dcd7-b5af-0000000007f4 49915 1727204313.27591: done sending task result for task 028d2410-947f-dcd7-b5af-0000000007f4 49915 1727204313.27594: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 49915 1727204313.27645: no more pending results, returning what we have 49915 1727204313.27648: results queue empty 49915 1727204313.27649: checking for any_errors_fatal 49915 1727204313.27656: done checking for any_errors_fatal 49915 1727204313.27657: checking for max_fail_percentage 49915 1727204313.27659: done checking for max_fail_percentage 49915 1727204313.27659: checking to see if all hosts have failed and the running result is not ok 49915 1727204313.27661: done checking to see if all hosts have failed 49915 1727204313.27661: getting the remaining hosts for this loop 49915 1727204313.27663: done getting the remaining hosts for this loop 49915 1727204313.27666: getting the next task for host managed-node2 49915 1727204313.27683: done getting next task for host managed-node2 49915 1727204313.27685: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 49915 1727204313.27689: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204313.27692: getting variables 49915 1727204313.27693: in VariableManager get_vars() 49915 1727204313.27729: Calling all_inventory to load vars for managed-node2 49915 1727204313.27731: Calling groups_inventory to load vars for managed-node2 49915 1727204313.27733: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204313.27742: Calling all_plugins_play to load vars for managed-node2 49915 1727204313.27745: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204313.27747: Calling groups_plugins_play to load vars for managed-node2 49915 1727204313.28512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204313.29460: done with get_vars() 49915 1727204313.29474: done getting variables 49915 1727204313.29520: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204313.29605: variable 'profile' from source: include params 49915 1727204313.29608: variable 'item' from source: include params 49915 1727204313.29650: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-lsr101] ************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:58:33 -0400 (0:00:00.036) 0:00:20.003 ***** 49915 1727204313.29680: entering _queue_task() for managed-node2/command 49915 1727204313.29908: worker is 1 (out of 1 available) 49915 1727204313.29922: exiting _queue_task() for managed-node2/command 49915 1727204313.29934: done queuing things up, now waiting for results queue to drain 49915 1727204313.29936: waiting for pending results... 
49915 1727204313.30109: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-lsr101 49915 1727204313.30212: in run() - task 028d2410-947f-dcd7-b5af-0000000007f6 49915 1727204313.30226: variable 'ansible_search_path' from source: unknown 49915 1727204313.30230: variable 'ansible_search_path' from source: unknown 49915 1727204313.30256: calling self._execute() 49915 1727204313.30330: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.30333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.30342: variable 'omit' from source: magic vars 49915 1727204313.30601: variable 'ansible_distribution_major_version' from source: facts 49915 1727204313.30613: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204313.30699: variable 'profile_stat' from source: set_fact 49915 1727204313.30710: Evaluated conditional (profile_stat.stat.exists): False 49915 1727204313.30714: when evaluation is False, skipping this task 49915 1727204313.30717: _execute() done 49915 1727204313.30723: dumping result to json 49915 1727204313.30725: done dumping result, returning 49915 1727204313.30730: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-lsr101 [028d2410-947f-dcd7-b5af-0000000007f6] 49915 1727204313.30736: sending task result for task 028d2410-947f-dcd7-b5af-0000000007f6 49915 1727204313.30813: done sending task result for task 028d2410-947f-dcd7-b5af-0000000007f6 49915 1727204313.30816: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49915 1727204313.30870: no more pending results, returning what we have 49915 1727204313.30874: results queue empty 49915 1727204313.30876: checking for any_errors_fatal 49915 1727204313.30883: done checking for any_errors_fatal 49915 1727204313.30884: checking for max_fail_percentage 49915 1727204313.30886: done checking for max_fail_percentage 49915 1727204313.30886: checking to see if all hosts have failed and the running result is not ok 49915 1727204313.30887: done checking to see if all hosts have failed 49915 1727204313.30888: getting the remaining hosts for this loop 49915 1727204313.30889: done getting the remaining hosts for this loop 49915 1727204313.30892: getting the next task for host managed-node2 49915 1727204313.30900: done getting next task for host managed-node2 49915 1727204313.30902: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 49915 1727204313.30905: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204313.30909: getting variables 49915 1727204313.30910: in VariableManager get_vars() 49915 1727204313.30947: Calling all_inventory to load vars for managed-node2 49915 1727204313.30950: Calling groups_inventory to load vars for managed-node2 49915 1727204313.30952: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204313.30961: Calling all_plugins_play to load vars for managed-node2 49915 1727204313.30963: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204313.30966: Calling groups_plugins_play to load vars for managed-node2 49915 1727204313.31799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204313.33144: done with get_vars() 49915 1727204313.33161: done getting variables 49915 1727204313.33204: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204313.33282: variable 'profile' from source: include params 49915 1727204313.33285: variable 'item' from source: include params 49915 1727204313.33326: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-lsr101] ********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:58:33 -0400 (0:00:00.036) 0:00:20.039 ***** 49915 1727204313.33349: entering _queue_task() for managed-node2/set_fact 49915 1727204313.33566: worker is 1 (out of 1 available) 49915 1727204313.33581: exiting _queue_task() for managed-node2/set_fact 49915 1727204313.33593: done queuing things up, now waiting for results queue to drain 49915 1727204313.33594: waiting for pending results... 
49915 1727204313.33765: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-lsr101 49915 1727204313.33849: in run() - task 028d2410-947f-dcd7-b5af-0000000007f7 49915 1727204313.33860: variable 'ansible_search_path' from source: unknown 49915 1727204313.33864: variable 'ansible_search_path' from source: unknown 49915 1727204313.33900: calling self._execute() 49915 1727204313.33971: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.33975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.33987: variable 'omit' from source: magic vars 49915 1727204313.34239: variable 'ansible_distribution_major_version' from source: facts 49915 1727204313.34247: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204313.34331: variable 'profile_stat' from source: set_fact 49915 1727204313.34342: Evaluated conditional (profile_stat.stat.exists): False 49915 1727204313.34345: when evaluation is False, skipping this task 49915 1727204313.34348: _execute() done 49915 1727204313.34350: dumping result to json 49915 1727204313.34353: done dumping result, returning 49915 1727204313.34361: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-lsr101 [028d2410-947f-dcd7-b5af-0000000007f7] 49915 1727204313.34363: sending task result for task 028d2410-947f-dcd7-b5af-0000000007f7 49915 1727204313.34444: done sending task result for task 028d2410-947f-dcd7-b5af-0000000007f7 49915 1727204313.34447: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49915 1727204313.34523: no more pending results, returning what we have 49915 1727204313.34526: results queue empty 49915 1727204313.34527: checking for any_errors_fatal 49915 1727204313.34531: done checking for any_errors_fatal 49915 1727204313.34531: checking for max_fail_percentage 49915 1727204313.34533: done checking for max_fail_percentage 49915 1727204313.34534: checking to see if all hosts have failed and the running result is not ok 49915 1727204313.34534: done checking to see if all hosts have failed 49915 1727204313.34535: getting the remaining hosts for this loop 49915 1727204313.34536: done getting the remaining hosts for this loop 49915 1727204313.34539: getting the next task for host managed-node2 49915 1727204313.34544: done getting next task for host managed-node2 49915 1727204313.34546: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 49915 1727204313.34550: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204313.34553: getting variables 49915 1727204313.34554: in VariableManager get_vars() 49915 1727204313.34590: Calling all_inventory to load vars for managed-node2 49915 1727204313.34593: Calling groups_inventory to load vars for managed-node2 49915 1727204313.34595: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204313.34606: Calling all_plugins_play to load vars for managed-node2 49915 1727204313.34609: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204313.34613: Calling groups_plugins_play to load vars for managed-node2 49915 1727204313.36462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204313.38369: done with get_vars() 49915 1727204313.38399: done getting variables 49915 1727204313.38457: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204313.38574: variable 'profile' from source: include params 49915 1727204313.38579: variable 'item' from source: include params 49915 1727204313.38641: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-lsr101] ***************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:58:33 -0400 (0:00:00.053) 0:00:20.092 ***** 49915 1727204313.38672: entering _queue_task() for managed-node2/command 49915 1727204313.39024: worker is 1 (out of 1 available) 49915 1727204313.39037: exiting _queue_task() for managed-node2/command 49915 1727204313.39052: done queuing things up, now waiting for results queue to drain 49915 1727204313.39053: waiting for pending results... 
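The entry above queues a command action for the task at get_profile_stat.yml:62, again guarded by profile_stat.stat.exists and therefore skipped in this run. A plausible sketch, assuming the fingerprint comment is looked up with a simple grep; the grep pattern, the ifcfg path, and the fingerprint_comment register name are assumptions made for illustration:

    - name: Get the fingerprint comment in ifcfg-{{ profile }}
      command: grep "^# System Role" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}  # hypothetical pattern and path
      register: fingerprint_comment   # hypothetical register name
      failed_when: false              # assumed, so a missing comment does not fail the play
      when: profile_stat.stat.exists  # conditional confirmed by the skip result below
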
49915 1727204313.39363: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-lsr101 49915 1727204313.39434: in run() - task 028d2410-947f-dcd7-b5af-0000000007f8 49915 1727204313.39460: variable 'ansible_search_path' from source: unknown 49915 1727204313.39470: variable 'ansible_search_path' from source: unknown 49915 1727204313.39519: calling self._execute() 49915 1727204313.39614: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.39625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.39638: variable 'omit' from source: magic vars 49915 1727204313.40582: variable 'ansible_distribution_major_version' from source: facts 49915 1727204313.40586: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204313.40982: variable 'profile_stat' from source: set_fact 49915 1727204313.40985: Evaluated conditional (profile_stat.stat.exists): False 49915 1727204313.40988: when evaluation is False, skipping this task 49915 1727204313.40991: _execute() done 49915 1727204313.40993: dumping result to json 49915 1727204313.40995: done dumping result, returning 49915 1727204313.40998: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-lsr101 [028d2410-947f-dcd7-b5af-0000000007f8] 49915 1727204313.41000: sending task result for task 028d2410-947f-dcd7-b5af-0000000007f8 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49915 1727204313.41135: no more pending results, returning what we have 49915 1727204313.41138: results queue empty 49915 1727204313.41139: checking for any_errors_fatal 49915 1727204313.41146: done checking for any_errors_fatal 49915 1727204313.41146: checking for max_fail_percentage 49915 1727204313.41148: done checking for max_fail_percentage 49915 1727204313.41150: checking to see if all hosts have failed and the running result is not ok 49915 1727204313.41151: done checking to see if all hosts have failed 49915 1727204313.41152: getting the remaining hosts for this loop 49915 1727204313.41153: done getting the remaining hosts for this loop 49915 1727204313.41157: getting the next task for host managed-node2 49915 1727204313.41165: done getting next task for host managed-node2 49915 1727204313.41168: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 49915 1727204313.41172: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204313.41178: getting variables 49915 1727204313.41179: in VariableManager get_vars() 49915 1727204313.41221: Calling all_inventory to load vars for managed-node2 49915 1727204313.41224: Calling groups_inventory to load vars for managed-node2 49915 1727204313.41226: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204313.41241: Calling all_plugins_play to load vars for managed-node2 49915 1727204313.41243: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204313.41247: Calling groups_plugins_play to load vars for managed-node2 49915 1727204313.41765: done sending task result for task 028d2410-947f-dcd7-b5af-0000000007f8 49915 1727204313.41768: WORKER PROCESS EXITING 49915 1727204313.43290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204313.46551: done with get_vars() 49915 1727204313.46784: done getting variables 49915 1727204313.46843: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204313.46952: variable 'profile' from source: include params 49915 1727204313.46956: variable 'item' from source: include params 49915 1727204313.47218: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-lsr101] ************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:58:33 -0400 (0:00:00.085) 0:00:20.178 ***** 49915 1727204313.47250: entering _queue_task() for managed-node2/set_fact 49915 1727204313.48030: worker is 1 (out of 1 available) 49915 1727204313.48045: exiting _queue_task() for managed-node2/set_fact 49915 1727204313.48058: done queuing things up, now waiting for results queue to drain 49915 1727204313.48059: waiting for pending results... 
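This queues the last of the three guarded tasks in this block, a set_fact at get_profile_stat.yml:69 that is also skipped because profile_stat.stat.exists is False. A sketch along the same lines, reusing the hypothetical fingerprint_comment register from the previous sketch; only the lsr_net_profile_fingerprint fact name is confirmed by this log:

    - name: Verify the fingerprint comment in ifcfg-{{ profile }}
      set_fact:
        lsr_net_profile_fingerprint: true  # fact name confirmed later in this log
      when:
        - profile_stat.stat.exists
        - fingerprint_comment is success   # hypothetical, tied to the assumed register above
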
49915 1727204313.48627: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-lsr101 49915 1727204313.48745: in run() - task 028d2410-947f-dcd7-b5af-0000000007f9 49915 1727204313.48910: variable 'ansible_search_path' from source: unknown 49915 1727204313.48917: variable 'ansible_search_path' from source: unknown 49915 1727204313.48933: calling self._execute() 49915 1727204313.49155: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.49300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.49305: variable 'omit' from source: magic vars 49915 1727204313.50264: variable 'ansible_distribution_major_version' from source: facts 49915 1727204313.50277: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204313.50470: variable 'profile_stat' from source: set_fact 49915 1727204313.50681: Evaluated conditional (profile_stat.stat.exists): False 49915 1727204313.50684: when evaluation is False, skipping this task 49915 1727204313.50686: _execute() done 49915 1727204313.50689: dumping result to json 49915 1727204313.50692: done dumping result, returning 49915 1727204313.50695: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-lsr101 [028d2410-947f-dcd7-b5af-0000000007f9] 49915 1727204313.50697: sending task result for task 028d2410-947f-dcd7-b5af-0000000007f9 49915 1727204313.50766: done sending task result for task 028d2410-947f-dcd7-b5af-0000000007f9 49915 1727204313.50770: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49915 1727204313.50828: no more pending results, returning what we have 49915 1727204313.50833: results queue empty 49915 1727204313.50834: checking for any_errors_fatal 49915 1727204313.50841: done checking for any_errors_fatal 49915 1727204313.50841: checking for max_fail_percentage 49915 1727204313.50843: done checking for max_fail_percentage 49915 1727204313.50844: checking to see if all hosts have failed and the running result is not ok 49915 1727204313.50845: done checking to see if all hosts have failed 49915 1727204313.50846: getting the remaining hosts for this loop 49915 1727204313.50847: done getting the remaining hosts for this loop 49915 1727204313.50851: getting the next task for host managed-node2 49915 1727204313.50859: done getting next task for host managed-node2 49915 1727204313.50862: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 49915 1727204313.50865: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204313.50870: getting variables 49915 1727204313.50872: in VariableManager get_vars() 49915 1727204313.50922: Calling all_inventory to load vars for managed-node2 49915 1727204313.50926: Calling groups_inventory to load vars for managed-node2 49915 1727204313.50928: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204313.50942: Calling all_plugins_play to load vars for managed-node2 49915 1727204313.50945: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204313.50947: Calling groups_plugins_play to load vars for managed-node2 49915 1727204313.52466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204313.54321: done with get_vars() 49915 1727204313.54345: done getting variables 49915 1727204313.54413: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204313.54532: variable 'profile' from source: include params 49915 1727204313.54536: variable 'item' from source: include params 49915 1727204313.54597: variable 'item' from source: include params TASK [Assert that the profile is present - 'lsr101'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:58:33 -0400 (0:00:00.073) 0:00:20.252 ***** 49915 1727204313.54632: entering _queue_task() for managed-node2/assert 49915 1727204313.55003: worker is 1 (out of 1 available) 49915 1727204313.55016: exiting _queue_task() for managed-node2/assert 49915 1727204313.55027: done queuing things up, now waiting for results queue to drain 49915 1727204313.55029: waiting for pending results... 
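The assert action queued above comes from assert_profile_present.yml:5. Beyond the distribution-version guard, the only conditional the trace below evaluates is lsr_net_profile_exists, which is True, so the task returns "All assertions passed". A sketch consistent with that trace; the fail_msg wording is an assumption:

    - name: "Assert that the profile is present - '{{ profile }}'"
      assert:
        that:
          - lsr_net_profile_exists                        # the conditional evaluated in the trace below
        fail_msg: "Profile {{ profile }} is not present"  # hypothetical message
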
49915 1727204313.55591: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'lsr101' 49915 1727204313.55600: in run() - task 028d2410-947f-dcd7-b5af-0000000006b9 49915 1727204313.55604: variable 'ansible_search_path' from source: unknown 49915 1727204313.55609: variable 'ansible_search_path' from source: unknown 49915 1727204313.55615: calling self._execute() 49915 1727204313.55668: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.55672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.55683: variable 'omit' from source: magic vars 49915 1727204313.56064: variable 'ansible_distribution_major_version' from source: facts 49915 1727204313.56077: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204313.56083: variable 'omit' from source: magic vars 49915 1727204313.56118: variable 'omit' from source: magic vars 49915 1727204313.56220: variable 'profile' from source: include params 49915 1727204313.56224: variable 'item' from source: include params 49915 1727204313.56354: variable 'item' from source: include params 49915 1727204313.56358: variable 'omit' from source: magic vars 49915 1727204313.56361: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204313.56388: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204313.56408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204313.56425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204313.56437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204313.56467: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204313.56471: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.56473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.56572: Set connection var ansible_connection to ssh 49915 1727204313.56575: Set connection var ansible_shell_type to sh 49915 1727204313.56581: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204313.56591: Set connection var ansible_shell_executable to /bin/sh 49915 1727204313.56603: Set connection var ansible_timeout to 10 49915 1727204313.56610: Set connection var ansible_pipelining to False 49915 1727204313.56632: variable 'ansible_shell_executable' from source: unknown 49915 1727204313.56636: variable 'ansible_connection' from source: unknown 49915 1727204313.56638: variable 'ansible_module_compression' from source: unknown 49915 1727204313.56640: variable 'ansible_shell_type' from source: unknown 49915 1727204313.56643: variable 'ansible_shell_executable' from source: unknown 49915 1727204313.56645: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.56647: variable 'ansible_pipelining' from source: unknown 49915 1727204313.56649: variable 'ansible_timeout' from source: unknown 49915 1727204313.56654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.56802: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204313.56811: variable 'omit' from source: magic vars 49915 1727204313.56825: starting attempt loop 49915 1727204313.56828: running the handler 49915 1727204313.56936: variable 'lsr_net_profile_exists' from source: set_fact 49915 1727204313.56940: Evaluated conditional (lsr_net_profile_exists): True 49915 1727204313.56980: handler run complete 49915 1727204313.56983: attempt loop complete, returning result 49915 1727204313.56985: _execute() done 49915 1727204313.56988: dumping result to json 49915 1727204313.56990: done dumping result, returning 49915 1727204313.56993: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'lsr101' [028d2410-947f-dcd7-b5af-0000000006b9] 49915 1727204313.56995: sending task result for task 028d2410-947f-dcd7-b5af-0000000006b9 49915 1727204313.57069: done sending task result for task 028d2410-947f-dcd7-b5af-0000000006b9 49915 1727204313.57072: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 49915 1727204313.57125: no more pending results, returning what we have 49915 1727204313.57128: results queue empty 49915 1727204313.57129: checking for any_errors_fatal 49915 1727204313.57138: done checking for any_errors_fatal 49915 1727204313.57139: checking for max_fail_percentage 49915 1727204313.57141: done checking for max_fail_percentage 49915 1727204313.57142: checking to see if all hosts have failed and the running result is not ok 49915 1727204313.57144: done checking to see if all hosts have failed 49915 1727204313.57144: getting the remaining hosts for this loop 49915 1727204313.57146: done getting the remaining hosts for this loop 49915 1727204313.57150: getting the next task for host managed-node2 49915 1727204313.57157: done getting next task for host managed-node2 49915 1727204313.57159: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 49915 1727204313.57163: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204313.57167: getting variables 49915 1727204313.57169: in VariableManager get_vars() 49915 1727204313.57216: Calling all_inventory to load vars for managed-node2 49915 1727204313.57219: Calling groups_inventory to load vars for managed-node2 49915 1727204313.57222: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204313.57233: Calling all_plugins_play to load vars for managed-node2 49915 1727204313.57236: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204313.57238: Calling groups_plugins_play to load vars for managed-node2 49915 1727204313.59134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204313.61786: done with get_vars() 49915 1727204313.61808: done getting variables 49915 1727204313.61872: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204313.61993: variable 'profile' from source: include params 49915 1727204313.61997: variable 'item' from source: include params 49915 1727204313.62060: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'lsr101'] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:58:33 -0400 (0:00:00.074) 0:00:20.327 ***** 49915 1727204313.62097: entering _queue_task() for managed-node2/assert 49915 1727204313.62424: worker is 1 (out of 1 available) 49915 1727204313.62436: exiting _queue_task() for managed-node2/assert 49915 1727204313.62448: done queuing things up, now waiting for results queue to drain 49915 1727204313.62450: waiting for pending results... 
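Same pattern as the previous assert, this time from assert_profile_present.yml:10 and keyed on lsr_net_profile_ansible_managed (evaluated True below). Note that the assert runs entirely on the controller: a connection and shell plugin are resolved in the trace, but no _low_level_execute_command() is issued. A sketch under the same assumptions as before:

    - name: Assert that the ansible managed comment is present in '{{ profile }}'
      assert:
        that:
          - lsr_net_profile_ansible_managed   # conditional evaluated True in the trace below
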
49915 1727204313.62793: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'lsr101' 49915 1727204313.62917: in run() - task 028d2410-947f-dcd7-b5af-0000000006ba 49915 1727204313.62922: variable 'ansible_search_path' from source: unknown 49915 1727204313.62925: variable 'ansible_search_path' from source: unknown 49915 1727204313.62928: calling self._execute() 49915 1727204313.62980: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.62987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.62997: variable 'omit' from source: magic vars 49915 1727204313.63360: variable 'ansible_distribution_major_version' from source: facts 49915 1727204313.63371: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204313.63379: variable 'omit' from source: magic vars 49915 1727204313.63417: variable 'omit' from source: magic vars 49915 1727204313.63521: variable 'profile' from source: include params 49915 1727204313.63525: variable 'item' from source: include params 49915 1727204313.63672: variable 'item' from source: include params 49915 1727204313.63677: variable 'omit' from source: magic vars 49915 1727204313.63680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204313.63692: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204313.63715: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204313.63730: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204313.63743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204313.63778: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204313.63782: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.63791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.63920: Set connection var ansible_connection to ssh 49915 1727204313.63924: Set connection var ansible_shell_type to sh 49915 1727204313.63931: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204313.63992: Set connection var ansible_shell_executable to /bin/sh 49915 1727204313.63995: Set connection var ansible_timeout to 10 49915 1727204313.63998: Set connection var ansible_pipelining to False 49915 1727204313.64000: variable 'ansible_shell_executable' from source: unknown 49915 1727204313.64002: variable 'ansible_connection' from source: unknown 49915 1727204313.64005: variable 'ansible_module_compression' from source: unknown 49915 1727204313.64036: variable 'ansible_shell_type' from source: unknown 49915 1727204313.64039: variable 'ansible_shell_executable' from source: unknown 49915 1727204313.64042: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.64044: variable 'ansible_pipelining' from source: unknown 49915 1727204313.64046: variable 'ansible_timeout' from source: unknown 49915 1727204313.64048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.64195: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204313.64208: variable 'omit' from source: magic vars 49915 1727204313.64214: starting attempt loop 49915 1727204313.64217: running the handler 49915 1727204313.64580: variable 'lsr_net_profile_ansible_managed' from source: set_fact 49915 1727204313.64583: Evaluated conditional (lsr_net_profile_ansible_managed): True 49915 1727204313.64585: handler run complete 49915 1727204313.64587: attempt loop complete, returning result 49915 1727204313.64589: _execute() done 49915 1727204313.64590: dumping result to json 49915 1727204313.64593: done dumping result, returning 49915 1727204313.64594: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'lsr101' [028d2410-947f-dcd7-b5af-0000000006ba] 49915 1727204313.64596: sending task result for task 028d2410-947f-dcd7-b5af-0000000006ba 49915 1727204313.64656: done sending task result for task 028d2410-947f-dcd7-b5af-0000000006ba 49915 1727204313.64659: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 49915 1727204313.64700: no more pending results, returning what we have 49915 1727204313.64703: results queue empty 49915 1727204313.64704: checking for any_errors_fatal 49915 1727204313.64709: done checking for any_errors_fatal 49915 1727204313.64710: checking for max_fail_percentage 49915 1727204313.64712: done checking for max_fail_percentage 49915 1727204313.64713: checking to see if all hosts have failed and the running result is not ok 49915 1727204313.64714: done checking to see if all hosts have failed 49915 1727204313.64714: getting the remaining hosts for this loop 49915 1727204313.64715: done getting the remaining hosts for this loop 49915 1727204313.64719: getting the next task for host managed-node2 49915 1727204313.64725: done getting next task for host managed-node2 49915 1727204313.64727: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 49915 1727204313.64730: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204313.64733: getting variables 49915 1727204313.64735: in VariableManager get_vars() 49915 1727204313.64847: Calling all_inventory to load vars for managed-node2 49915 1727204313.64850: Calling groups_inventory to load vars for managed-node2 49915 1727204313.64853: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204313.64864: Calling all_plugins_play to load vars for managed-node2 49915 1727204313.64867: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204313.64870: Calling groups_plugins_play to load vars for managed-node2 49915 1727204313.66722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204313.68680: done with get_vars() 49915 1727204313.68714: done getting variables 49915 1727204313.68772: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204313.68890: variable 'profile' from source: include params 49915 1727204313.68894: variable 'item' from source: include params 49915 1727204313.68956: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in lsr101] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:58:33 -0400 (0:00:00.068) 0:00:20.396 ***** 49915 1727204313.68994: entering _queue_task() for managed-node2/assert 49915 1727204313.69338: worker is 1 (out of 1 available) 49915 1727204313.69356: exiting _queue_task() for managed-node2/assert 49915 1727204313.69369: done queuing things up, now waiting for results queue to drain 49915 1727204313.69371: waiting for pending results... 
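The third assert, from assert_profile_present.yml:15, checks lsr_net_profile_fingerprint in the same way. As a design note, the three separate asserts seen in this log could just as well be collapsed into a single task; a sketch of that alternative, not the collection's actual layout:

    - name: Assert that profile '{{ profile }}' is present with both comments
      assert:
        that:
          - lsr_net_profile_exists
          - lsr_net_profile_ansible_managed
          - lsr_net_profile_fingerprint
        fail_msg: "ifcfg-{{ profile }} is missing or lacks the expected comments"  # hypothetical message
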
49915 1727204313.69665: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in lsr101 49915 1727204313.69707: in run() - task 028d2410-947f-dcd7-b5af-0000000006bb 49915 1727204313.69721: variable 'ansible_search_path' from source: unknown 49915 1727204313.69724: variable 'ansible_search_path' from source: unknown 49915 1727204313.69760: calling self._execute() 49915 1727204313.69853: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.69857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.69867: variable 'omit' from source: magic vars 49915 1727204313.70236: variable 'ansible_distribution_major_version' from source: facts 49915 1727204313.70247: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204313.70253: variable 'omit' from source: magic vars 49915 1727204313.70298: variable 'omit' from source: magic vars 49915 1727204313.70504: variable 'profile' from source: include params 49915 1727204313.70507: variable 'item' from source: include params 49915 1727204313.70546: variable 'item' from source: include params 49915 1727204313.70562: variable 'omit' from source: magic vars 49915 1727204313.70602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204313.70637: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204313.70699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204313.70702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204313.70706: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204313.70738: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204313.70744: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.70746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.71046: Set connection var ansible_connection to ssh 49915 1727204313.71049: Set connection var ansible_shell_type to sh 49915 1727204313.71052: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204313.71055: Set connection var ansible_shell_executable to /bin/sh 49915 1727204313.71057: Set connection var ansible_timeout to 10 49915 1727204313.71059: Set connection var ansible_pipelining to False 49915 1727204313.71090: variable 'ansible_shell_executable' from source: unknown 49915 1727204313.71093: variable 'ansible_connection' from source: unknown 49915 1727204313.71096: variable 'ansible_module_compression' from source: unknown 49915 1727204313.71098: variable 'ansible_shell_type' from source: unknown 49915 1727204313.71101: variable 'ansible_shell_executable' from source: unknown 49915 1727204313.71103: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.71109: variable 'ansible_pipelining' from source: unknown 49915 1727204313.71114: variable 'ansible_timeout' from source: unknown 49915 1727204313.71121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.71484: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204313.71495: variable 'omit' from source: magic vars 49915 1727204313.71503: starting attempt loop 49915 1727204313.71507: running the handler 49915 1727204313.71725: variable 'lsr_net_profile_fingerprint' from source: set_fact 49915 1727204313.71729: Evaluated conditional (lsr_net_profile_fingerprint): True 49915 1727204313.71735: handler run complete 49915 1727204313.71750: attempt loop complete, returning result 49915 1727204313.71753: _execute() done 49915 1727204313.71756: dumping result to json 49915 1727204313.71759: done dumping result, returning 49915 1727204313.71765: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in lsr101 [028d2410-947f-dcd7-b5af-0000000006bb] 49915 1727204313.71888: sending task result for task 028d2410-947f-dcd7-b5af-0000000006bb 49915 1727204313.72088: done sending task result for task 028d2410-947f-dcd7-b5af-0000000006bb 49915 1727204313.72093: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 49915 1727204313.72151: no more pending results, returning what we have 49915 1727204313.72154: results queue empty 49915 1727204313.72155: checking for any_errors_fatal 49915 1727204313.72161: done checking for any_errors_fatal 49915 1727204313.72161: checking for max_fail_percentage 49915 1727204313.72164: done checking for max_fail_percentage 49915 1727204313.72165: checking to see if all hosts have failed and the running result is not ok 49915 1727204313.72166: done checking to see if all hosts have failed 49915 1727204313.72167: getting the remaining hosts for this loop 49915 1727204313.72169: done getting the remaining hosts for this loop 49915 1727204313.72173: getting the next task for host managed-node2 49915 1727204313.72189: done getting next task for host managed-node2 49915 1727204313.72193: ^ task is: TASK: Include the task 'get_profile_stat.yml' 49915 1727204313.72197: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204313.72201: getting variables 49915 1727204313.72203: in VariableManager get_vars() 49915 1727204313.72250: Calling all_inventory to load vars for managed-node2 49915 1727204313.72253: Calling groups_inventory to load vars for managed-node2 49915 1727204313.72256: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204313.72268: Calling all_plugins_play to load vars for managed-node2 49915 1727204313.72271: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204313.72274: Calling groups_plugins_play to load vars for managed-node2 49915 1727204313.74028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204313.80155: done with get_vars() 49915 1727204313.80181: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:58:33 -0400 (0:00:00.112) 0:00:20.508 ***** 49915 1727204313.80260: entering _queue_task() for managed-node2/include_tasks 49915 1727204313.80615: worker is 1 (out of 1 available) 49915 1727204313.80880: exiting _queue_task() for managed-node2/include_tasks 49915 1727204313.80890: done queuing things up, now waiting for results queue to drain 49915 1727204313.80891: waiting for pending results... 49915 1727204313.80924: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 49915 1727204313.81077: in run() - task 028d2410-947f-dcd7-b5af-0000000006bf 49915 1727204313.81082: variable 'ansible_search_path' from source: unknown 49915 1727204313.81084: variable 'ansible_search_path' from source: unknown 49915 1727204313.81087: calling self._execute() 49915 1727204313.81168: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.81174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.81293: variable 'omit' from source: magic vars 49915 1727204313.81579: variable 'ansible_distribution_major_version' from source: facts 49915 1727204313.81591: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204313.81596: _execute() done 49915 1727204313.81599: dumping result to json 49915 1727204313.81602: done dumping result, returning 49915 1727204313.81608: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [028d2410-947f-dcd7-b5af-0000000006bf] 49915 1727204313.81620: sending task result for task 028d2410-947f-dcd7-b5af-0000000006bf 49915 1727204313.81710: done sending task result for task 028d2410-947f-dcd7-b5af-0000000006bf 49915 1727204313.81715: WORKER PROCESS EXITING 49915 1727204313.81759: no more pending results, returning what we have 49915 1727204313.81766: in VariableManager get_vars() 49915 1727204313.81823: Calling all_inventory to load vars for managed-node2 49915 1727204313.81826: Calling groups_inventory to load vars for managed-node2 49915 1727204313.81828: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204313.81842: Calling all_plugins_play to load vars for managed-node2 49915 1727204313.81845: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204313.81848: Calling groups_plugins_play to load vars for managed-node2 49915 1727204313.83385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 49915 1727204313.85390: done with get_vars() 49915 1727204313.85420: variable 'ansible_search_path' from source: unknown 49915 1727204313.85421: variable 'ansible_search_path' from source: unknown 49915 1727204313.85460: we have included files to process 49915 1727204313.85462: generating all_blocks data 49915 1727204313.85464: done generating all_blocks data 49915 1727204313.85468: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 49915 1727204313.85469: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 49915 1727204313.85472: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 49915 1727204313.86435: done processing included file 49915 1727204313.86437: iterating over new_blocks loaded from include file 49915 1727204313.86439: in VariableManager get_vars() 49915 1727204313.86459: done with get_vars() 49915 1727204313.86461: filtering new block on tags 49915 1727204313.86490: done filtering new block on tags 49915 1727204313.86493: in VariableManager get_vars() 49915 1727204313.86511: done with get_vars() 49915 1727204313.86515: filtering new block on tags 49915 1727204313.86536: done filtering new block on tags 49915 1727204313.86537: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 49915 1727204313.86543: extending task lists for all hosts with included blocks 49915 1727204313.86723: done extending task lists 49915 1727204313.86725: done processing included files 49915 1727204313.86725: results queue empty 49915 1727204313.86726: checking for any_errors_fatal 49915 1727204313.86730: done checking for any_errors_fatal 49915 1727204313.86731: checking for max_fail_percentage 49915 1727204313.86732: done checking for max_fail_percentage 49915 1727204313.86733: checking to see if all hosts have failed and the running result is not ok 49915 1727204313.86734: done checking to see if all hosts have failed 49915 1727204313.86734: getting the remaining hosts for this loop 49915 1727204313.86735: done getting the remaining hosts for this loop 49915 1727204313.86738: getting the next task for host managed-node2 49915 1727204313.86742: done getting next task for host managed-node2 49915 1727204313.86744: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 49915 1727204313.86747: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204313.86749: getting variables 49915 1727204313.86750: in VariableManager get_vars() 49915 1727204313.86763: Calling all_inventory to load vars for managed-node2 49915 1727204313.86765: Calling groups_inventory to load vars for managed-node2 49915 1727204313.86767: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204313.86774: Calling all_plugins_play to load vars for managed-node2 49915 1727204313.86778: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204313.86781: Calling groups_plugins_play to load vars for managed-node2 49915 1727204313.88072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204313.89670: done with get_vars() 49915 1727204313.89694: done getting variables 49915 1727204313.89744: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:58:33 -0400 (0:00:00.095) 0:00:20.603 ***** 49915 1727204313.89774: entering _queue_task() for managed-node2/set_fact 49915 1727204313.90151: worker is 1 (out of 1 available) 49915 1727204313.90168: exiting _queue_task() for managed-node2/set_fact 49915 1727204313.90185: done queuing things up, now waiting for results queue to drain 49915 1727204313.90187: waiting for pending results... 
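The include of get_profile_stat.yml has just been processed (the extended task list appears above), and its first task, the set_fact queued here, initializes the three flags that the asserts elsewhere in this log consume. The fact names and their initial false values are taken directly from the ok: result a few entries further down; only the YAML layout is a reconstruction:

    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false
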
49915 1727204313.90448: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 49915 1727204313.90560: in run() - task 028d2410-947f-dcd7-b5af-000000000838 49915 1727204313.90602: variable 'ansible_search_path' from source: unknown 49915 1727204313.90606: variable 'ansible_search_path' from source: unknown 49915 1727204313.90618: calling self._execute() 49915 1727204313.90711: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.90717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.90782: variable 'omit' from source: magic vars 49915 1727204313.91098: variable 'ansible_distribution_major_version' from source: facts 49915 1727204313.91111: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204313.91117: variable 'omit' from source: magic vars 49915 1727204313.91170: variable 'omit' from source: magic vars 49915 1727204313.91205: variable 'omit' from source: magic vars 49915 1727204313.91243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204313.91359: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204313.91362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204313.91365: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204313.91367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204313.91369: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204313.91372: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.91374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.91455: Set connection var ansible_connection to ssh 49915 1727204313.91459: Set connection var ansible_shell_type to sh 49915 1727204313.91462: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204313.91481: Set connection var ansible_shell_executable to /bin/sh 49915 1727204313.91492: Set connection var ansible_timeout to 10 49915 1727204313.91495: Set connection var ansible_pipelining to False 49915 1727204313.91519: variable 'ansible_shell_executable' from source: unknown 49915 1727204313.91522: variable 'ansible_connection' from source: unknown 49915 1727204313.91526: variable 'ansible_module_compression' from source: unknown 49915 1727204313.91528: variable 'ansible_shell_type' from source: unknown 49915 1727204313.91531: variable 'ansible_shell_executable' from source: unknown 49915 1727204313.91533: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.91536: variable 'ansible_pipelining' from source: unknown 49915 1727204313.91538: variable 'ansible_timeout' from source: unknown 49915 1727204313.91540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.91711: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204313.91718: variable 
'omit' from source: magic vars 49915 1727204313.91721: starting attempt loop 49915 1727204313.91723: running the handler 49915 1727204313.91726: handler run complete 49915 1727204313.91728: attempt loop complete, returning result 49915 1727204313.91730: _execute() done 49915 1727204313.91733: dumping result to json 49915 1727204313.91735: done dumping result, returning 49915 1727204313.91737: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [028d2410-947f-dcd7-b5af-000000000838] 49915 1727204313.91739: sending task result for task 028d2410-947f-dcd7-b5af-000000000838 49915 1727204313.91877: done sending task result for task 028d2410-947f-dcd7-b5af-000000000838 49915 1727204313.91880: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 49915 1727204313.91972: no more pending results, returning what we have 49915 1727204313.91974: results queue empty 49915 1727204313.91977: checking for any_errors_fatal 49915 1727204313.91978: done checking for any_errors_fatal 49915 1727204313.91979: checking for max_fail_percentage 49915 1727204313.91980: done checking for max_fail_percentage 49915 1727204313.91981: checking to see if all hosts have failed and the running result is not ok 49915 1727204313.91982: done checking to see if all hosts have failed 49915 1727204313.91983: getting the remaining hosts for this loop 49915 1727204313.91984: done getting the remaining hosts for this loop 49915 1727204313.91987: getting the next task for host managed-node2 49915 1727204313.91992: done getting next task for host managed-node2 49915 1727204313.91994: ^ task is: TASK: Stat profile file 49915 1727204313.91997: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204313.92000: getting variables 49915 1727204313.92004: in VariableManager get_vars() 49915 1727204313.92041: Calling all_inventory to load vars for managed-node2 49915 1727204313.92043: Calling groups_inventory to load vars for managed-node2 49915 1727204313.92045: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204313.92054: Calling all_plugins_play to load vars for managed-node2 49915 1727204313.92056: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204313.92058: Calling groups_plugins_play to load vars for managed-node2 49915 1727204313.93480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204313.94943: done with get_vars() 49915 1727204313.94967: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:58:33 -0400 (0:00:00.052) 0:00:20.656 ***** 49915 1727204313.95064: entering _queue_task() for managed-node2/stat 49915 1727204313.95610: worker is 1 (out of 1 available) 49915 1727204313.95619: exiting _queue_task() for managed-node2/stat 49915 1727204313.95629: done queuing things up, now waiting for results queue to drain 49915 1727204313.95630: waiting for pending results... 49915 1727204313.95793: running TaskExecutor() for managed-node2/TASK: Stat profile file 49915 1727204313.95820: in run() - task 028d2410-947f-dcd7-b5af-000000000839 49915 1727204313.95835: variable 'ansible_search_path' from source: unknown 49915 1727204313.95840: variable 'ansible_search_path' from source: unknown 49915 1727204313.95884: calling self._execute() 49915 1727204313.95991: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.95995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.95998: variable 'omit' from source: magic vars 49915 1727204313.96428: variable 'ansible_distribution_major_version' from source: facts 49915 1727204313.96432: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204313.96435: variable 'omit' from source: magic vars 49915 1727204313.96459: variable 'omit' from source: magic vars 49915 1727204313.96558: variable 'profile' from source: include params 49915 1727204313.96561: variable 'item' from source: include params 49915 1727204313.96626: variable 'item' from source: include params 49915 1727204313.96817: variable 'omit' from source: magic vars 49915 1727204313.96821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204313.96824: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204313.96827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204313.96829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204313.96834: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204313.96837: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204313.96839: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.96841: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.96930: Set connection var ansible_connection to ssh 49915 1727204313.96933: Set connection var ansible_shell_type to sh 49915 1727204313.96941: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204313.96962: Set connection var ansible_shell_executable to /bin/sh 49915 1727204313.96965: Set connection var ansible_timeout to 10 49915 1727204313.96974: Set connection var ansible_pipelining to False 49915 1727204313.97004: variable 'ansible_shell_executable' from source: unknown 49915 1727204313.97008: variable 'ansible_connection' from source: unknown 49915 1727204313.97010: variable 'ansible_module_compression' from source: unknown 49915 1727204313.97015: variable 'ansible_shell_type' from source: unknown 49915 1727204313.97018: variable 'ansible_shell_executable' from source: unknown 49915 1727204313.97020: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204313.97022: variable 'ansible_pipelining' from source: unknown 49915 1727204313.97024: variable 'ansible_timeout' from source: unknown 49915 1727204313.97027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204313.97345: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49915 1727204313.97581: variable 'omit' from source: magic vars 49915 1727204313.97584: starting attempt loop 49915 1727204313.97587: running the handler 49915 1727204313.97589: _low_level_execute_command(): starting 49915 1727204313.97590: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204313.98999: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204313.99070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204313.99359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204313.99606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204314.01323: stdout chunk (state=3): >>>/root <<< 49915 1727204314.01470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204314.01478: stdout chunk (state=3): >>><<< 49915 1727204314.01489: stderr chunk (state=3): >>><<< 49915 1727204314.01550: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204314.01566: _low_level_execute_command(): starting 49915 1727204314.01573: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204314.0155137-51303-180446388892371 `" && echo ansible-tmp-1727204314.0155137-51303-180446388892371="` echo /root/.ansible/tmp/ansible-tmp-1727204314.0155137-51303-180446388892371 `" ) && sleep 0' 49915 1727204314.02684: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204314.02869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204314.03167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204314.03249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204314.05215: stdout chunk (state=3): >>>ansible-tmp-1727204314.0155137-51303-180446388892371=/root/.ansible/tmp/ansible-tmp-1727204314.0155137-51303-180446388892371 <<< 49915 1727204314.05364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204314.05379: stdout chunk (state=3): >>><<< 49915 1727204314.05392: stderr chunk (state=3): >>><<< 49915 1727204314.05582: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204314.0155137-51303-180446388892371=/root/.ansible/tmp/ansible-tmp-1727204314.0155137-51303-180446388892371 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204314.05586: variable 'ansible_module_compression' from source: unknown 49915 1727204314.05720: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 49915 1727204314.05880: variable 'ansible_facts' from source: unknown 49915 1727204314.06084: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204314.0155137-51303-180446388892371/AnsiballZ_stat.py 49915 1727204314.06906: Sending initial data 49915 1727204314.06909: Sent initial data (153 bytes) 49915 1727204314.07767: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204314.07771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204314.07891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204314.07906: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204314.07923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204314.07978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204314.07990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204314.08169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204314.08425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204314.10237: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" 
revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204314.10284: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49915 1727204314.10376: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp51byh8ej /root/.ansible/tmp/ansible-tmp-1727204314.0155137-51303-180446388892371/AnsiballZ_stat.py <<< 49915 1727204314.10435: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204314.0155137-51303-180446388892371/AnsiballZ_stat.py" <<< 49915 1727204314.10456: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp51byh8ej" to remote "/root/.ansible/tmp/ansible-tmp-1727204314.0155137-51303-180446388892371/AnsiballZ_stat.py" <<< 49915 1727204314.10467: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204314.0155137-51303-180446388892371/AnsiballZ_stat.py" <<< 49915 1727204314.11914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204314.11919: stdout chunk (state=3): >>><<< 49915 1727204314.11926: stderr chunk (state=3): >>><<< 49915 1727204314.11965: done transferring module to remote 49915 1727204314.11976: _low_level_execute_command(): starting 49915 1727204314.11983: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204314.0155137-51303-180446388892371/ /root/.ansible/tmp/ansible-tmp-1727204314.0155137-51303-180446388892371/AnsiballZ_stat.py && sleep 0' 49915 1727204314.12671: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204314.12734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204314.12751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204314.12762: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204314.12863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 49915 1727204314.14944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204314.14947: stdout chunk (state=3): >>><<< 49915 1727204314.14953: stderr chunk (state=3): >>><<< 49915 1727204314.14956: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204314.14959: _low_level_execute_command(): starting 49915 1727204314.14961: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204314.0155137-51303-180446388892371/AnsiballZ_stat.py && sleep 0' 49915 1727204314.15968: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204314.15971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204314.15974: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204314.15979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204314.15981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204314.16017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204314.16021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204314.16036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204314.16183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204314.34161: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": 
false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} <<< 49915 1727204314.35500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 49915 1727204314.35532: stderr chunk (state=3): >>><<< 49915 1727204314.35536: stdout chunk (state=3): >>><<< 49915 1727204314.35554: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
49915 1727204314.35578: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr101.90', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204314.0155137-51303-180446388892371/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204314.35589: _low_level_execute_command(): starting 49915 1727204314.35594: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204314.0155137-51303-180446388892371/ > /dev/null 2>&1 && sleep 0' 49915 1727204314.36050: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204314.36054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 49915 1727204314.36057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204314.36111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204314.36151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204314.36245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204314.38144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204314.38166: stderr chunk (state=3): >>><<< 49915 1727204314.38169: stdout chunk (state=3): >>><<< 49915 1727204314.38185: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204314.38195: handler run complete 49915 1727204314.38256: attempt loop complete, returning result 49915 1727204314.38260: _execute() done 49915 1727204314.38263: dumping result to json 49915 1727204314.38267: done dumping result, returning 49915 1727204314.38269: done running TaskExecutor() for managed-node2/TASK: Stat profile file [028d2410-947f-dcd7-b5af-000000000839] 49915 1727204314.38271: sending task result for task 028d2410-947f-dcd7-b5af-000000000839 49915 1727204314.38337: done sending task result for task 028d2410-947f-dcd7-b5af-000000000839 49915 1727204314.38340: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 49915 1727204314.38414: no more pending results, returning what we have 49915 1727204314.38418: results queue empty 49915 1727204314.38419: checking for any_errors_fatal 49915 1727204314.38424: done checking for any_errors_fatal 49915 1727204314.38425: checking for max_fail_percentage 49915 1727204314.38427: done checking for max_fail_percentage 49915 1727204314.38428: checking to see if all hosts have failed and the running result is not ok 49915 1727204314.38429: done checking to see if all hosts have failed 49915 1727204314.38430: getting the remaining hosts for this loop 49915 1727204314.38431: done getting the remaining hosts for this loop 49915 1727204314.38436: getting the next task for host managed-node2 49915 1727204314.38443: done getting next task for host managed-node2 49915 1727204314.38445: ^ task is: TASK: Set NM profile exist flag based on the profile files 49915 1727204314.38449: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204314.38453: getting variables 49915 1727204314.38455: in VariableManager get_vars() 49915 1727204314.38531: Calling all_inventory to load vars for managed-node2 49915 1727204314.38534: Calling groups_inventory to load vars for managed-node2 49915 1727204314.38536: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204314.38548: Calling all_plugins_play to load vars for managed-node2 49915 1727204314.38551: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204314.38554: Calling groups_plugins_play to load vars for managed-node2 49915 1727204314.40378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204314.41252: done with get_vars() 49915 1727204314.41269: done getting variables 49915 1727204314.41314: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:58:34 -0400 (0:00:00.462) 0:00:21.119 ***** 49915 1727204314.41339: entering _queue_task() for managed-node2/set_fact 49915 1727204314.41585: worker is 1 (out of 1 available) 49915 1727204314.41600: exiting _queue_task() for managed-node2/set_fact 49915 1727204314.41612: done queuing things up, now waiting for results queue to drain 49915 1727204314.41613: waiting for pending results... 
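In the records that follow, the conditional profile_stat.stat.exists is evaluated and the task is skipped because the ifcfg file does not exist, so the set_fact at get_profile_stat.yml:17 presumably follows this pattern; the fact name below is a placeholder, since the log records only the conditional, not the fact being assigned.

- name: Set NM profile exist flag based on the profile files
  ansible.builtin.set_fact:
    # Placeholder fact name; the log shows the when: condition but not
    # which fact this task actually sets.
    lsr_net_profile_exists: true
  when: profile_stat.stat.exists
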
49915 1727204314.41788: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 49915 1727204314.41871: in run() - task 028d2410-947f-dcd7-b5af-00000000083a 49915 1727204314.41883: variable 'ansible_search_path' from source: unknown 49915 1727204314.41887: variable 'ansible_search_path' from source: unknown 49915 1727204314.41915: calling self._execute() 49915 1727204314.41989: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204314.41993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204314.42005: variable 'omit' from source: magic vars 49915 1727204314.42594: variable 'ansible_distribution_major_version' from source: facts 49915 1727204314.42598: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204314.42600: variable 'profile_stat' from source: set_fact 49915 1727204314.42602: Evaluated conditional (profile_stat.stat.exists): False 49915 1727204314.42604: when evaluation is False, skipping this task 49915 1727204314.42605: _execute() done 49915 1727204314.42607: dumping result to json 49915 1727204314.42609: done dumping result, returning 49915 1727204314.42611: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [028d2410-947f-dcd7-b5af-00000000083a] 49915 1727204314.42613: sending task result for task 028d2410-947f-dcd7-b5af-00000000083a 49915 1727204314.42669: done sending task result for task 028d2410-947f-dcd7-b5af-00000000083a 49915 1727204314.42671: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49915 1727204314.42723: no more pending results, returning what we have 49915 1727204314.42727: results queue empty 49915 1727204314.42728: checking for any_errors_fatal 49915 1727204314.42737: done checking for any_errors_fatal 49915 1727204314.42737: checking for max_fail_percentage 49915 1727204314.42739: done checking for max_fail_percentage 49915 1727204314.42740: checking to see if all hosts have failed and the running result is not ok 49915 1727204314.42741: done checking to see if all hosts have failed 49915 1727204314.42741: getting the remaining hosts for this loop 49915 1727204314.42743: done getting the remaining hosts for this loop 49915 1727204314.42746: getting the next task for host managed-node2 49915 1727204314.42752: done getting next task for host managed-node2 49915 1727204314.42754: ^ task is: TASK: Get NM profile info 49915 1727204314.42757: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204314.42760: getting variables 49915 1727204314.42762: in VariableManager get_vars() 49915 1727204314.42798: Calling all_inventory to load vars for managed-node2 49915 1727204314.42800: Calling groups_inventory to load vars for managed-node2 49915 1727204314.42802: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204314.42811: Calling all_plugins_play to load vars for managed-node2 49915 1727204314.42814: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204314.42816: Calling groups_plugins_play to load vars for managed-node2 49915 1727204314.43845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204314.44825: done with get_vars() 49915 1727204314.44841: done getting variables 49915 1727204314.44884: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:58:34 -0400 (0:00:00.035) 0:00:21.155 ***** 49915 1727204314.44906: entering _queue_task() for managed-node2/shell 49915 1727204314.45132: worker is 1 (out of 1 available) 49915 1727204314.45145: exiting _queue_task() for managed-node2/shell 49915 1727204314.45157: done queuing things up, now waiting for results queue to drain 49915 1727204314.45158: waiting for pending results... 49915 1727204314.45333: running TaskExecutor() for managed-node2/TASK: Get NM profile info 49915 1727204314.45402: in run() - task 028d2410-947f-dcd7-b5af-00000000083b 49915 1727204314.45415: variable 'ansible_search_path' from source: unknown 49915 1727204314.45420: variable 'ansible_search_path' from source: unknown 49915 1727204314.45447: calling self._execute() 49915 1727204314.45523: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204314.45529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204314.45537: variable 'omit' from source: magic vars 49915 1727204314.45821: variable 'ansible_distribution_major_version' from source: facts 49915 1727204314.45830: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204314.45836: variable 'omit' from source: magic vars 49915 1727204314.45865: variable 'omit' from source: magic vars 49915 1727204314.45941: variable 'profile' from source: include params 49915 1727204314.45944: variable 'item' from source: include params 49915 1727204314.45992: variable 'item' from source: include params 49915 1727204314.46007: variable 'omit' from source: magic vars 49915 1727204314.46048: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204314.46071: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204314.46088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204314.46101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204314.46112: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204314.46140: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204314.46143: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204314.46145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204314.46211: Set connection var ansible_connection to ssh 49915 1727204314.46214: Set connection var ansible_shell_type to sh 49915 1727204314.46222: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204314.46230: Set connection var ansible_shell_executable to /bin/sh 49915 1727204314.46234: Set connection var ansible_timeout to 10 49915 1727204314.46241: Set connection var ansible_pipelining to False 49915 1727204314.46259: variable 'ansible_shell_executable' from source: unknown 49915 1727204314.46263: variable 'ansible_connection' from source: unknown 49915 1727204314.46265: variable 'ansible_module_compression' from source: unknown 49915 1727204314.46268: variable 'ansible_shell_type' from source: unknown 49915 1727204314.46270: variable 'ansible_shell_executable' from source: unknown 49915 1727204314.46272: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204314.46274: variable 'ansible_pipelining' from source: unknown 49915 1727204314.46278: variable 'ansible_timeout' from source: unknown 49915 1727204314.46280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204314.46383: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204314.46393: variable 'omit' from source: magic vars 49915 1727204314.46398: starting attempt loop 49915 1727204314.46400: running the handler 49915 1727204314.46411: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204314.46429: _low_level_execute_command(): starting 49915 1727204314.46435: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204314.46960: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204314.46965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204314.46968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204314.47009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204314.47015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204314.47030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204314.47106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204314.48790: stdout chunk (state=3): >>>/root <<< 49915 1727204314.48893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204314.48926: stderr chunk (state=3): >>><<< 49915 1727204314.48929: stdout chunk (state=3): >>><<< 49915 1727204314.48947: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204314.48957: _low_level_execute_command(): starting 49915 1727204314.48961: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204314.489451-51338-175633757338967 `" && echo ansible-tmp-1727204314.489451-51338-175633757338967="` echo /root/.ansible/tmp/ansible-tmp-1727204314.489451-51338-175633757338967 `" ) && sleep 0' 49915 1727204314.49371: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204314.49386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204314.49407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204314.49410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204314.49463: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204314.49467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204314.49547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204314.51457: stdout chunk (state=3): >>>ansible-tmp-1727204314.489451-51338-175633757338967=/root/.ansible/tmp/ansible-tmp-1727204314.489451-51338-175633757338967 <<< 49915 1727204314.51561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204314.51586: stderr chunk (state=3): >>><<< 49915 1727204314.51590: stdout chunk (state=3): >>><<< 49915 1727204314.51609: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204314.489451-51338-175633757338967=/root/.ansible/tmp/ansible-tmp-1727204314.489451-51338-175633757338967 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204314.51635: variable 'ansible_module_compression' from source: unknown 49915 1727204314.51672: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49915 1727204314.51707: variable 'ansible_facts' from source: unknown 49915 1727204314.51759: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204314.489451-51338-175633757338967/AnsiballZ_command.py 49915 1727204314.51854: Sending initial data 49915 1727204314.51857: Sent initial data (155 bytes) 49915 1727204314.52288: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204314.52292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204314.52294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 49915 1727204314.52297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204314.52344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204314.52347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204314.52421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204314.53989: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204314.54058: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49915 1727204314.54130: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpnzlaashr /root/.ansible/tmp/ansible-tmp-1727204314.489451-51338-175633757338967/AnsiballZ_command.py <<< 49915 1727204314.54133: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204314.489451-51338-175633757338967/AnsiballZ_command.py" <<< 49915 1727204314.54202: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpnzlaashr" to remote "/root/.ansible/tmp/ansible-tmp-1727204314.489451-51338-175633757338967/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204314.489451-51338-175633757338967/AnsiballZ_command.py" <<< 49915 1727204314.54841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204314.54873: stderr chunk (state=3): >>><<< 49915 1727204314.54878: stdout chunk (state=3): >>><<< 49915 1727204314.54899: done transferring module to remote 49915 1727204314.54908: _low_level_execute_command(): starting 49915 1727204314.54912: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204314.489451-51338-175633757338967/ /root/.ansible/tmp/ansible-tmp-1727204314.489451-51338-175633757338967/AnsiballZ_command.py && sleep 0' 49915 1727204314.55332: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204314.55335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204314.55337: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204314.55343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204314.55402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204314.55404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204314.55469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204314.57246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204314.57267: stderr chunk (state=3): >>><<< 49915 1727204314.57271: stdout chunk (state=3): >>><<< 49915 1727204314.57286: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204314.57289: _low_level_execute_command(): starting 49915 1727204314.57292: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204314.489451-51338-175633757338967/AnsiballZ_command.py && sleep 0' 49915 1727204314.57713: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204314.57717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204314.57719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204314.57721: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204314.57723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204314.57786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204314.57788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204314.57854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204314.74949: stdout chunk (state=3): >>> {"changed": true, "stdout": "lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "start": "2024-09-24 14:58:34.728254", "end": "2024-09-24 14:58:34.748118", "delta": "0:00:00.019864", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49915 1727204314.76597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 49915 1727204314.76627: stderr chunk (state=3): >>><<< 49915 1727204314.76631: stdout chunk (state=3): >>><<< 49915 1727204314.76647: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "start": "2024-09-24 14:58:34.728254", "end": "2024-09-24 14:58:34.748118", "delta": "0:00:00.019864", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.13.254 closed. 49915 1727204314.76675: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204314.489451-51338-175633757338967/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204314.76683: _low_level_execute_command(): starting 49915 1727204314.76688: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204314.489451-51338-175633757338967/ > /dev/null 2>&1 && sleep 0' 49915 1727204314.77134: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204314.77144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204314.77178: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204314.77182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 49915 1727204314.77184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204314.77186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204314.77228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204314.77240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204314.77317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204314.79173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204314.79206: stderr chunk (state=3): >>><<< 49915 1727204314.79209: stdout chunk (state=3): >>><<< 49915 1727204314.79221: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204314.79227: handler run complete 49915 1727204314.79245: Evaluated conditional (False): False 49915 1727204314.79254: attempt loop complete, returning result 49915 1727204314.79256: _execute() done 49915 1727204314.79259: dumping result to json 49915 1727204314.79264: done dumping result, returning 49915 1727204314.79271: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [028d2410-947f-dcd7-b5af-00000000083b] 49915 1727204314.79278: sending task result for task 028d2410-947f-dcd7-b5af-00000000083b 49915 1727204314.79378: done sending task result for task 028d2410-947f-dcd7-b5af-00000000083b 49915 1727204314.79381: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "delta": "0:00:00.019864", "end": "2024-09-24 14:58:34.748118", "rc": 0, "start": "2024-09-24 14:58:34.728254" } STDOUT: lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection 49915 1727204314.79448: no more pending results, returning what we have 49915 1727204314.79451: results queue empty 49915 1727204314.79452: checking for any_errors_fatal 49915 1727204314.79458: done checking for any_errors_fatal 49915 1727204314.79458: checking for max_fail_percentage 49915 1727204314.79461: done checking for max_fail_percentage 49915 1727204314.79461: checking to see if all hosts have failed and the running result is not ok 49915 1727204314.79463: done checking to see if all hosts have failed 49915 1727204314.79463: getting the remaining hosts for this loop 49915 1727204314.79465: done getting the remaining hosts for this loop 49915 1727204314.79468: getting the next task for host managed-node2 49915 1727204314.79485: done getting next task for host managed-node2 49915 1727204314.79488: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 49915 1727204314.79491: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204314.79495: getting variables 49915 1727204314.79496: in VariableManager get_vars() 49915 1727204314.79541: Calling all_inventory to load vars for managed-node2 49915 1727204314.79544: Calling groups_inventory to load vars for managed-node2 49915 1727204314.79546: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204314.79556: Calling all_plugins_play to load vars for managed-node2 49915 1727204314.79558: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204314.79560: Calling groups_plugins_play to load vars for managed-node2 49915 1727204314.80389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204314.81263: done with get_vars() 49915 1727204314.81280: done getting variables 49915 1727204314.81326: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:58:34 -0400 (0:00:00.364) 0:00:21.519 ***** 49915 1727204314.81351: entering _queue_task() for managed-node2/set_fact 49915 1727204314.81583: worker is 1 (out of 1 available) 49915 1727204314.81596: exiting _queue_task() for managed-node2/set_fact 49915 1727204314.81608: done queuing things up, now waiting for results queue to drain 49915 1727204314.81610: waiting for pending results... 
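The ok: result above shows what the profile lookup found: nmcli reports that the lsr101.90 connection is backed by /etc/NetworkManager/system-connections/lsr101.90.nmconnection, i.e. a NetworkManager keyfile profile rather than an initscripts ifcfg file. A minimal sketch of a task that would produce this kind of result, assuming it registers the command output as nm_profile_exists (the variable the next task's conditional tests); the error handling shown here is illustrative, not the collection's actual task file:

    # Sketch only: run the same pipeline the log records and register the result.
    # grep exits non-zero when the profile is absent, so rc carries the answer.
    - name: Get NM profile info
      ansible.builtin.shell: nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc
      register: nm_profile_exists
      ignore_errors: true  # assumption: keep going even when the profile is missing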
49915 1727204314.81779: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 49915 1727204314.81853: in run() - task 028d2410-947f-dcd7-b5af-00000000083c 49915 1727204314.81865: variable 'ansible_search_path' from source: unknown 49915 1727204314.81868: variable 'ansible_search_path' from source: unknown 49915 1727204314.81898: calling self._execute() 49915 1727204314.81971: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204314.81979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204314.81988: variable 'omit' from source: magic vars 49915 1727204314.82259: variable 'ansible_distribution_major_version' from source: facts 49915 1727204314.82269: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204314.82358: variable 'nm_profile_exists' from source: set_fact 49915 1727204314.82370: Evaluated conditional (nm_profile_exists.rc == 0): True 49915 1727204314.82377: variable 'omit' from source: magic vars 49915 1727204314.82409: variable 'omit' from source: magic vars 49915 1727204314.82433: variable 'omit' from source: magic vars 49915 1727204314.82464: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204314.82494: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204314.82514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204314.82526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204314.82536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204314.82560: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204314.82563: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204314.82565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204314.82635: Set connection var ansible_connection to ssh 49915 1727204314.82638: Set connection var ansible_shell_type to sh 49915 1727204314.82644: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204314.82652: Set connection var ansible_shell_executable to /bin/sh 49915 1727204314.82657: Set connection var ansible_timeout to 10 49915 1727204314.82663: Set connection var ansible_pipelining to False 49915 1727204314.82681: variable 'ansible_shell_executable' from source: unknown 49915 1727204314.82684: variable 'ansible_connection' from source: unknown 49915 1727204314.82687: variable 'ansible_module_compression' from source: unknown 49915 1727204314.82689: variable 'ansible_shell_type' from source: unknown 49915 1727204314.82691: variable 'ansible_shell_executable' from source: unknown 49915 1727204314.82693: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204314.82697: variable 'ansible_pipelining' from source: unknown 49915 1727204314.82700: variable 'ansible_timeout' from source: unknown 49915 1727204314.82704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204314.82809: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204314.82820: variable 'omit' from source: magic vars 49915 1727204314.82830: starting attempt loop 49915 1727204314.82833: running the handler 49915 1727204314.82836: handler run complete 49915 1727204314.82847: attempt loop complete, returning result 49915 1727204314.82850: _execute() done 49915 1727204314.82852: dumping result to json 49915 1727204314.82854: done dumping result, returning 49915 1727204314.82861: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [028d2410-947f-dcd7-b5af-00000000083c] 49915 1727204314.82866: sending task result for task 028d2410-947f-dcd7-b5af-00000000083c 49915 1727204314.82948: done sending task result for task 028d2410-947f-dcd7-b5af-00000000083c 49915 1727204314.82951: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 49915 1727204314.83002: no more pending results, returning what we have 49915 1727204314.83005: results queue empty 49915 1727204314.83006: checking for any_errors_fatal 49915 1727204314.83015: done checking for any_errors_fatal 49915 1727204314.83015: checking for max_fail_percentage 49915 1727204314.83017: done checking for max_fail_percentage 49915 1727204314.83018: checking to see if all hosts have failed and the running result is not ok 49915 1727204314.83019: done checking to see if all hosts have failed 49915 1727204314.83020: getting the remaining hosts for this loop 49915 1727204314.83021: done getting the remaining hosts for this loop 49915 1727204314.83024: getting the next task for host managed-node2 49915 1727204314.83034: done getting next task for host managed-node2 49915 1727204314.83036: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 49915 1727204314.83040: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204314.83044: getting variables 49915 1727204314.83045: in VariableManager get_vars() 49915 1727204314.83081: Calling all_inventory to load vars for managed-node2 49915 1727204314.83083: Calling groups_inventory to load vars for managed-node2 49915 1727204314.83085: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204314.83094: Calling all_plugins_play to load vars for managed-node2 49915 1727204314.83097: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204314.83099: Calling groups_plugins_play to load vars for managed-node2 49915 1727204314.84001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204314.84858: done with get_vars() 49915 1727204314.84872: done getting variables 49915 1727204314.84919: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204314.85002: variable 'profile' from source: include params 49915 1727204314.85005: variable 'item' from source: include params 49915 1727204314.85048: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-lsr101.90] ********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:58:34 -0400 (0:00:00.037) 0:00:21.556 ***** 49915 1727204314.85077: entering _queue_task() for managed-node2/command 49915 1727204314.85303: worker is 1 (out of 1 available) 49915 1727204314.85318: exiting _queue_task() for managed-node2/command 49915 1727204314.85330: done queuing things up, now waiting for results queue to drain 49915 1727204314.85331: waiting for pending results... 
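The facts reported in the ansible_facts block above are what the later assert tasks consume. A sketch of a set_fact task consistent with this output, assuming the three flags are plain booleans gated on the registered nmcli result; the real task in get_profile_stat.yml may derive them differently:

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0  # the conditional the log shows evaluating to True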
49915 1727204314.85513: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-lsr101.90 49915 1727204314.85596: in run() - task 028d2410-947f-dcd7-b5af-00000000083e 49915 1727204314.85606: variable 'ansible_search_path' from source: unknown 49915 1727204314.85610: variable 'ansible_search_path' from source: unknown 49915 1727204314.85639: calling self._execute() 49915 1727204314.85722: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204314.85726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204314.85735: variable 'omit' from source: magic vars 49915 1727204314.86014: variable 'ansible_distribution_major_version' from source: facts 49915 1727204314.86026: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204314.86109: variable 'profile_stat' from source: set_fact 49915 1727204314.86123: Evaluated conditional (profile_stat.stat.exists): False 49915 1727204314.86126: when evaluation is False, skipping this task 49915 1727204314.86129: _execute() done 49915 1727204314.86131: dumping result to json 49915 1727204314.86134: done dumping result, returning 49915 1727204314.86140: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-lsr101.90 [028d2410-947f-dcd7-b5af-00000000083e] 49915 1727204314.86145: sending task result for task 028d2410-947f-dcd7-b5af-00000000083e 49915 1727204314.86225: done sending task result for task 028d2410-947f-dcd7-b5af-00000000083e 49915 1727204314.86227: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49915 1727204314.86280: no more pending results, returning what we have 49915 1727204314.86283: results queue empty 49915 1727204314.86284: checking for any_errors_fatal 49915 1727204314.86289: done checking for any_errors_fatal 49915 1727204314.86290: checking for max_fail_percentage 49915 1727204314.86292: done checking for max_fail_percentage 49915 1727204314.86293: checking to see if all hosts have failed and the running result is not ok 49915 1727204314.86294: done checking to see if all hosts have failed 49915 1727204314.86294: getting the remaining hosts for this loop 49915 1727204314.86296: done getting the remaining hosts for this loop 49915 1727204314.86299: getting the next task for host managed-node2 49915 1727204314.86306: done getting next task for host managed-node2 49915 1727204314.86308: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 49915 1727204314.86312: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204314.86315: getting variables 49915 1727204314.86316: in VariableManager get_vars() 49915 1727204314.86352: Calling all_inventory to load vars for managed-node2 49915 1727204314.86354: Calling groups_inventory to load vars for managed-node2 49915 1727204314.86356: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204314.86366: Calling all_plugins_play to load vars for managed-node2 49915 1727204314.86368: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204314.86370: Calling groups_plugins_play to load vars for managed-node2 49915 1727204314.87144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204314.88020: done with get_vars() 49915 1727204314.88034: done getting variables 49915 1727204314.88077: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204314.88151: variable 'profile' from source: include params 49915 1727204314.88154: variable 'item' from source: include params 49915 1727204314.88195: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-lsr101.90] ******************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:58:34 -0400 (0:00:00.031) 0:00:21.588 ***** 49915 1727204314.88220: entering _queue_task() for managed-node2/set_fact 49915 1727204314.88431: worker is 1 (out of 1 available) 49915 1727204314.88444: exiting _queue_task() for managed-node2/set_fact 49915 1727204314.88456: done queuing things up, now waiting for results queue to drain 49915 1727204314.88457: waiting for pending results... 
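This skip, and the three that follow it, hinge on the same guard: the ifcfg comment checks only apply to initscripts-style profiles, and profile_stat.stat.exists is False because lsr101.90 is stored as a keyfile, so each task is recorded with "Conditional result was False". A sketch of the guard pattern, where the grep expression and the ifcfg path are assumptions rather than the collection's actual code:

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      ansible.builtin.command: grep '^# Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      when: profile_stat.stat.exists  # False here, so Ansible logs a skip instead of running the command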
49915 1727204314.88626: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-lsr101.90 49915 1727204314.88703: in run() - task 028d2410-947f-dcd7-b5af-00000000083f 49915 1727204314.88717: variable 'ansible_search_path' from source: unknown 49915 1727204314.88720: variable 'ansible_search_path' from source: unknown 49915 1727204314.88747: calling self._execute() 49915 1727204314.88820: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204314.88826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204314.88834: variable 'omit' from source: magic vars 49915 1727204314.89095: variable 'ansible_distribution_major_version' from source: facts 49915 1727204314.89104: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204314.89191: variable 'profile_stat' from source: set_fact 49915 1727204314.89201: Evaluated conditional (profile_stat.stat.exists): False 49915 1727204314.89204: when evaluation is False, skipping this task 49915 1727204314.89206: _execute() done 49915 1727204314.89210: dumping result to json 49915 1727204314.89215: done dumping result, returning 49915 1727204314.89219: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-lsr101.90 [028d2410-947f-dcd7-b5af-00000000083f] 49915 1727204314.89229: sending task result for task 028d2410-947f-dcd7-b5af-00000000083f 49915 1727204314.89308: done sending task result for task 028d2410-947f-dcd7-b5af-00000000083f 49915 1727204314.89311: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49915 1727204314.89373: no more pending results, returning what we have 49915 1727204314.89378: results queue empty 49915 1727204314.89379: checking for any_errors_fatal 49915 1727204314.89383: done checking for any_errors_fatal 49915 1727204314.89384: checking for max_fail_percentage 49915 1727204314.89386: done checking for max_fail_percentage 49915 1727204314.89386: checking to see if all hosts have failed and the running result is not ok 49915 1727204314.89387: done checking to see if all hosts have failed 49915 1727204314.89388: getting the remaining hosts for this loop 49915 1727204314.89389: done getting the remaining hosts for this loop 49915 1727204314.89392: getting the next task for host managed-node2 49915 1727204314.89399: done getting next task for host managed-node2 49915 1727204314.89401: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 49915 1727204314.89405: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204314.89408: getting variables 49915 1727204314.89409: in VariableManager get_vars() 49915 1727204314.89443: Calling all_inventory to load vars for managed-node2 49915 1727204314.89446: Calling groups_inventory to load vars for managed-node2 49915 1727204314.89448: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204314.89457: Calling all_plugins_play to load vars for managed-node2 49915 1727204314.89459: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204314.89461: Calling groups_plugins_play to load vars for managed-node2 49915 1727204314.90326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204314.91187: done with get_vars() 49915 1727204314.91202: done getting variables 49915 1727204314.91247: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204314.91322: variable 'profile' from source: include params 49915 1727204314.91325: variable 'item' from source: include params 49915 1727204314.91365: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-lsr101.90] ************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:58:34 -0400 (0:00:00.031) 0:00:21.620 ***** 49915 1727204314.91389: entering _queue_task() for managed-node2/command 49915 1727204314.91600: worker is 1 (out of 1 available) 49915 1727204314.91613: exiting _queue_task() for managed-node2/command 49915 1727204314.91626: done queuing things up, now waiting for results queue to drain 49915 1727204314.91627: waiting for pending results... 
49915 1727204314.91799: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-lsr101.90 49915 1727204314.91872: in run() - task 028d2410-947f-dcd7-b5af-000000000840 49915 1727204314.91883: variable 'ansible_search_path' from source: unknown 49915 1727204314.91887: variable 'ansible_search_path' from source: unknown 49915 1727204314.91919: calling self._execute() 49915 1727204314.91991: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204314.91994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204314.92005: variable 'omit' from source: magic vars 49915 1727204314.92269: variable 'ansible_distribution_major_version' from source: facts 49915 1727204314.92280: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204314.92367: variable 'profile_stat' from source: set_fact 49915 1727204314.92379: Evaluated conditional (profile_stat.stat.exists): False 49915 1727204314.92382: when evaluation is False, skipping this task 49915 1727204314.92385: _execute() done 49915 1727204314.92387: dumping result to json 49915 1727204314.92392: done dumping result, returning 49915 1727204314.92404: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-lsr101.90 [028d2410-947f-dcd7-b5af-000000000840] 49915 1727204314.92406: sending task result for task 028d2410-947f-dcd7-b5af-000000000840 49915 1727204314.92486: done sending task result for task 028d2410-947f-dcd7-b5af-000000000840 49915 1727204314.92488: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49915 1727204314.92564: no more pending results, returning what we have 49915 1727204314.92567: results queue empty 49915 1727204314.92568: checking for any_errors_fatal 49915 1727204314.92572: done checking for any_errors_fatal 49915 1727204314.92573: checking for max_fail_percentage 49915 1727204314.92574: done checking for max_fail_percentage 49915 1727204314.92578: checking to see if all hosts have failed and the running result is not ok 49915 1727204314.92579: done checking to see if all hosts have failed 49915 1727204314.92580: getting the remaining hosts for this loop 49915 1727204314.92581: done getting the remaining hosts for this loop 49915 1727204314.92584: getting the next task for host managed-node2 49915 1727204314.92589: done getting next task for host managed-node2 49915 1727204314.92591: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 49915 1727204314.92595: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204314.92598: getting variables 49915 1727204314.92599: in VariableManager get_vars() 49915 1727204314.92640: Calling all_inventory to load vars for managed-node2 49915 1727204314.92642: Calling groups_inventory to load vars for managed-node2 49915 1727204314.92645: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204314.92653: Calling all_plugins_play to load vars for managed-node2 49915 1727204314.92656: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204314.92658: Calling groups_plugins_play to load vars for managed-node2 49915 1727204314.93407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204314.94368: done with get_vars() 49915 1727204314.94386: done getting variables 49915 1727204314.94428: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204314.94503: variable 'profile' from source: include params 49915 1727204314.94506: variable 'item' from source: include params 49915 1727204314.94545: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-lsr101.90] *********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:58:34 -0400 (0:00:00.031) 0:00:21.651 ***** 49915 1727204314.94568: entering _queue_task() for managed-node2/set_fact 49915 1727204314.94781: worker is 1 (out of 1 available) 49915 1727204314.94794: exiting _queue_task() for managed-node2/set_fact 49915 1727204314.94807: done queuing things up, now waiting for results queue to drain 49915 1727204314.94808: waiting for pending results... 
49915 1727204314.94972: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-lsr101.90 49915 1727204314.95042: in run() - task 028d2410-947f-dcd7-b5af-000000000841 49915 1727204314.95054: variable 'ansible_search_path' from source: unknown 49915 1727204314.95057: variable 'ansible_search_path' from source: unknown 49915 1727204314.95086: calling self._execute() 49915 1727204314.95156: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204314.95161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204314.95169: variable 'omit' from source: magic vars 49915 1727204314.95433: variable 'ansible_distribution_major_version' from source: facts 49915 1727204314.95443: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204314.95529: variable 'profile_stat' from source: set_fact 49915 1727204314.95540: Evaluated conditional (profile_stat.stat.exists): False 49915 1727204314.95543: when evaluation is False, skipping this task 49915 1727204314.95545: _execute() done 49915 1727204314.95548: dumping result to json 49915 1727204314.95550: done dumping result, returning 49915 1727204314.95557: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-lsr101.90 [028d2410-947f-dcd7-b5af-000000000841] 49915 1727204314.95562: sending task result for task 028d2410-947f-dcd7-b5af-000000000841 49915 1727204314.95644: done sending task result for task 028d2410-947f-dcd7-b5af-000000000841 49915 1727204314.95646: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 49915 1727204314.95727: no more pending results, returning what we have 49915 1727204314.95730: results queue empty 49915 1727204314.95731: checking for any_errors_fatal 49915 1727204314.95735: done checking for any_errors_fatal 49915 1727204314.95736: checking for max_fail_percentage 49915 1727204314.95738: done checking for max_fail_percentage 49915 1727204314.95738: checking to see if all hosts have failed and the running result is not ok 49915 1727204314.95739: done checking to see if all hosts have failed 49915 1727204314.95740: getting the remaining hosts for this loop 49915 1727204314.95741: done getting the remaining hosts for this loop 49915 1727204314.95745: getting the next task for host managed-node2 49915 1727204314.95751: done getting next task for host managed-node2 49915 1727204314.95753: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 49915 1727204314.95756: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204314.95759: getting variables 49915 1727204314.95760: in VariableManager get_vars() 49915 1727204314.95796: Calling all_inventory to load vars for managed-node2 49915 1727204314.95798: Calling groups_inventory to load vars for managed-node2 49915 1727204314.95800: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204314.95809: Calling all_plugins_play to load vars for managed-node2 49915 1727204314.95811: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204314.95814: Calling groups_plugins_play to load vars for managed-node2 49915 1727204314.96542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204314.97408: done with get_vars() 49915 1727204314.97423: done getting variables 49915 1727204314.97466: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204314.97541: variable 'profile' from source: include params 49915 1727204314.97544: variable 'item' from source: include params 49915 1727204314.97586: variable 'item' from source: include params TASK [Assert that the profile is present - 'lsr101.90'] ************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:58:34 -0400 (0:00:00.030) 0:00:21.682 ***** 49915 1727204314.97606: entering _queue_task() for managed-node2/assert 49915 1727204314.97810: worker is 1 (out of 1 available) 49915 1727204314.97821: exiting _queue_task() for managed-node2/assert 49915 1727204314.97834: done queuing things up, now waiting for results queue to drain 49915 1727204314.97835: waiting for pending results... 
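All four skipped tasks above test profile_stat.stat.exists. The debug output labels profile_stat as coming "from source: set_fact", which is how registered results are reported, so it is plausibly the registered output of a stat task earlier in get_profile_stat.yml; a sketch under that assumption, with the ifcfg path being a guess:

    - name: Get stat of the profile ifcfg file
      ansible.builtin.stat:
        path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      register: profile_stat  # .stat.exists drives the four when: guards above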
49915 1727204314.97999: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'lsr101.90' 49915 1727204314.98066: in run() - task 028d2410-947f-dcd7-b5af-0000000006c0 49915 1727204314.98079: variable 'ansible_search_path' from source: unknown 49915 1727204314.98083: variable 'ansible_search_path' from source: unknown 49915 1727204314.98112: calling self._execute() 49915 1727204314.98184: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204314.98188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204314.98196: variable 'omit' from source: magic vars 49915 1727204314.98451: variable 'ansible_distribution_major_version' from source: facts 49915 1727204314.98460: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204314.98466: variable 'omit' from source: magic vars 49915 1727204314.98491: variable 'omit' from source: magic vars 49915 1727204314.98563: variable 'profile' from source: include params 49915 1727204314.98567: variable 'item' from source: include params 49915 1727204314.98617: variable 'item' from source: include params 49915 1727204314.98634: variable 'omit' from source: magic vars 49915 1727204314.98664: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204314.98692: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204314.98708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204314.98728: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204314.98737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204314.98760: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204314.98763: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204314.98765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204314.98833: Set connection var ansible_connection to ssh 49915 1727204314.98836: Set connection var ansible_shell_type to sh 49915 1727204314.98842: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204314.98851: Set connection var ansible_shell_executable to /bin/sh 49915 1727204314.98855: Set connection var ansible_timeout to 10 49915 1727204314.98862: Set connection var ansible_pipelining to False 49915 1727204314.98880: variable 'ansible_shell_executable' from source: unknown 49915 1727204314.98883: variable 'ansible_connection' from source: unknown 49915 1727204314.98885: variable 'ansible_module_compression' from source: unknown 49915 1727204314.98888: variable 'ansible_shell_type' from source: unknown 49915 1727204314.98890: variable 'ansible_shell_executable' from source: unknown 49915 1727204314.98894: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204314.98898: variable 'ansible_pipelining' from source: unknown 49915 1727204314.98900: variable 'ansible_timeout' from source: unknown 49915 1727204314.98904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204314.99005: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204314.99013: variable 'omit' from source: magic vars 49915 1727204314.99021: starting attempt loop 49915 1727204314.99024: running the handler 49915 1727204314.99102: variable 'lsr_net_profile_exists' from source: set_fact 49915 1727204314.99106: Evaluated conditional (lsr_net_profile_exists): True 49915 1727204314.99112: handler run complete 49915 1727204314.99126: attempt loop complete, returning result 49915 1727204314.99128: _execute() done 49915 1727204314.99131: dumping result to json 49915 1727204314.99133: done dumping result, returning 49915 1727204314.99139: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'lsr101.90' [028d2410-947f-dcd7-b5af-0000000006c0] 49915 1727204314.99145: sending task result for task 028d2410-947f-dcd7-b5af-0000000006c0 49915 1727204314.99222: done sending task result for task 028d2410-947f-dcd7-b5af-0000000006c0 49915 1727204314.99225: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 49915 1727204314.99303: no more pending results, returning what we have 49915 1727204314.99306: results queue empty 49915 1727204314.99306: checking for any_errors_fatal 49915 1727204314.99311: done checking for any_errors_fatal 49915 1727204314.99312: checking for max_fail_percentage 49915 1727204314.99313: done checking for max_fail_percentage 49915 1727204314.99314: checking to see if all hosts have failed and the running result is not ok 49915 1727204314.99315: done checking to see if all hosts have failed 49915 1727204314.99316: getting the remaining hosts for this loop 49915 1727204314.99317: done getting the remaining hosts for this loop 49915 1727204314.99319: getting the next task for host managed-node2 49915 1727204314.99324: done getting next task for host managed-node2 49915 1727204314.99326: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 49915 1727204314.99329: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204314.99331: getting variables 49915 1727204314.99333: in VariableManager get_vars() 49915 1727204314.99367: Calling all_inventory to load vars for managed-node2 49915 1727204314.99369: Calling groups_inventory to load vars for managed-node2 49915 1727204314.99371: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204314.99381: Calling all_plugins_play to load vars for managed-node2 49915 1727204314.99384: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204314.99386: Calling groups_plugins_play to load vars for managed-node2 49915 1727204315.00248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204315.01101: done with get_vars() 49915 1727204315.01118: done getting variables 49915 1727204315.01161: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204315.01235: variable 'profile' from source: include params 49915 1727204315.01238: variable 'item' from source: include params 49915 1727204315.01280: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'lsr101.90'] ******* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:58:35 -0400 (0:00:00.036) 0:00:21.719 ***** 49915 1727204315.01305: entering _queue_task() for managed-node2/assert 49915 1727204315.01508: worker is 1 (out of 1 available) 49915 1727204315.01523: exiting _queue_task() for managed-node2/assert 49915 1727204315.01535: done queuing things up, now waiting for results queue to drain 49915 1727204315.01536: waiting for pending results... 
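This first assertion, and the two that follow for the ansible managed comment and the fingerprint comment, each check one of the booleans set earlier and report "All assertions passed". A sketch of the three assert tasks as the log describes them; the task names come from the log, the structure is illustrative:

    - name: Assert that the profile is present - '{{ profile }}'
      ansible.builtin.assert:
        that: lsr_net_profile_exists

    - name: Assert that the ansible managed comment is present in '{{ profile }}'
      ansible.builtin.assert:
        that: lsr_net_profile_ansible_managed

    - name: Assert that the fingerprint comment is present in {{ profile }}
      ansible.builtin.assert:
        that: lsr_net_profile_fingerprint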
49915 1727204315.01702: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'lsr101.90' 49915 1727204315.01768: in run() - task 028d2410-947f-dcd7-b5af-0000000006c1 49915 1727204315.01782: variable 'ansible_search_path' from source: unknown 49915 1727204315.01788: variable 'ansible_search_path' from source: unknown 49915 1727204315.01814: calling self._execute() 49915 1727204315.01885: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204315.01889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204315.01899: variable 'omit' from source: magic vars 49915 1727204315.02160: variable 'ansible_distribution_major_version' from source: facts 49915 1727204315.02169: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204315.02174: variable 'omit' from source: magic vars 49915 1727204315.02204: variable 'omit' from source: magic vars 49915 1727204315.02273: variable 'profile' from source: include params 49915 1727204315.02278: variable 'item' from source: include params 49915 1727204315.02327: variable 'item' from source: include params 49915 1727204315.02343: variable 'omit' from source: magic vars 49915 1727204315.02374: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204315.02402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204315.02422: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204315.02434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204315.02445: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204315.02468: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204315.02471: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204315.02473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204315.02540: Set connection var ansible_connection to ssh 49915 1727204315.02543: Set connection var ansible_shell_type to sh 49915 1727204315.02549: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204315.02557: Set connection var ansible_shell_executable to /bin/sh 49915 1727204315.02563: Set connection var ansible_timeout to 10 49915 1727204315.02568: Set connection var ansible_pipelining to False 49915 1727204315.02585: variable 'ansible_shell_executable' from source: unknown 49915 1727204315.02588: variable 'ansible_connection' from source: unknown 49915 1727204315.02590: variable 'ansible_module_compression' from source: unknown 49915 1727204315.02593: variable 'ansible_shell_type' from source: unknown 49915 1727204315.02595: variable 'ansible_shell_executable' from source: unknown 49915 1727204315.02597: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204315.02602: variable 'ansible_pipelining' from source: unknown 49915 1727204315.02605: variable 'ansible_timeout' from source: unknown 49915 1727204315.02608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204315.02709: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204315.02720: variable 'omit' from source: magic vars 49915 1727204315.02725: starting attempt loop 49915 1727204315.02727: running the handler 49915 1727204315.02804: variable 'lsr_net_profile_ansible_managed' from source: set_fact 49915 1727204315.02808: Evaluated conditional (lsr_net_profile_ansible_managed): True 49915 1727204315.02813: handler run complete 49915 1727204315.02826: attempt loop complete, returning result 49915 1727204315.02829: _execute() done 49915 1727204315.02831: dumping result to json 49915 1727204315.02834: done dumping result, returning 49915 1727204315.02840: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'lsr101.90' [028d2410-947f-dcd7-b5af-0000000006c1] 49915 1727204315.02845: sending task result for task 028d2410-947f-dcd7-b5af-0000000006c1 49915 1727204315.02923: done sending task result for task 028d2410-947f-dcd7-b5af-0000000006c1 49915 1727204315.02926: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 49915 1727204315.03000: no more pending results, returning what we have 49915 1727204315.03003: results queue empty 49915 1727204315.03004: checking for any_errors_fatal 49915 1727204315.03009: done checking for any_errors_fatal 49915 1727204315.03010: checking for max_fail_percentage 49915 1727204315.03011: done checking for max_fail_percentage 49915 1727204315.03012: checking to see if all hosts have failed and the running result is not ok 49915 1727204315.03013: done checking to see if all hosts have failed 49915 1727204315.03014: getting the remaining hosts for this loop 49915 1727204315.03015: done getting the remaining hosts for this loop 49915 1727204315.03018: getting the next task for host managed-node2 49915 1727204315.03023: done getting next task for host managed-node2 49915 1727204315.03025: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 49915 1727204315.03028: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204315.03031: getting variables 49915 1727204315.03032: in VariableManager get_vars() 49915 1727204315.03064: Calling all_inventory to load vars for managed-node2 49915 1727204315.03067: Calling groups_inventory to load vars for managed-node2 49915 1727204315.03069: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204315.03081: Calling all_plugins_play to load vars for managed-node2 49915 1727204315.03083: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204315.03086: Calling groups_plugins_play to load vars for managed-node2 49915 1727204315.04002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204315.05546: done with get_vars() 49915 1727204315.05568: done getting variables 49915 1727204315.05632: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204315.05744: variable 'profile' from source: include params 49915 1727204315.05748: variable 'item' from source: include params 49915 1727204315.05809: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in lsr101.90] ************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:58:35 -0400 (0:00:00.045) 0:00:21.764 ***** 49915 1727204315.05849: entering _queue_task() for managed-node2/assert 49915 1727204315.06155: worker is 1 (out of 1 available) 49915 1727204315.06168: exiting _queue_task() for managed-node2/assert 49915 1727204315.06182: done queuing things up, now waiting for results queue to drain 49915 1727204315.06184: waiting for pending results... 
49915 1727204315.06561: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in lsr101.90 49915 1727204315.06567: in run() - task 028d2410-947f-dcd7-b5af-0000000006c2 49915 1727204315.06588: variable 'ansible_search_path' from source: unknown 49915 1727204315.06592: variable 'ansible_search_path' from source: unknown 49915 1727204315.06630: calling self._execute() 49915 1727204315.06727: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204315.06768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204315.06771: variable 'omit' from source: magic vars 49915 1727204315.07113: variable 'ansible_distribution_major_version' from source: facts 49915 1727204315.07133: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204315.07139: variable 'omit' from source: magic vars 49915 1727204315.07287: variable 'omit' from source: magic vars 49915 1727204315.07290: variable 'profile' from source: include params 49915 1727204315.07293: variable 'item' from source: include params 49915 1727204315.07363: variable 'item' from source: include params 49915 1727204315.07386: variable 'omit' from source: magic vars 49915 1727204315.07429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204315.07471: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204315.07494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204315.07511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204315.07527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204315.07558: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204315.07568: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204315.07571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204315.07677: Set connection var ansible_connection to ssh 49915 1727204315.07683: Set connection var ansible_shell_type to sh 49915 1727204315.07691: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204315.07703: Set connection var ansible_shell_executable to /bin/sh 49915 1727204315.07708: Set connection var ansible_timeout to 10 49915 1727204315.07721: Set connection var ansible_pipelining to False 49915 1727204315.07741: variable 'ansible_shell_executable' from source: unknown 49915 1727204315.07744: variable 'ansible_connection' from source: unknown 49915 1727204315.07747: variable 'ansible_module_compression' from source: unknown 49915 1727204315.07749: variable 'ansible_shell_type' from source: unknown 49915 1727204315.07751: variable 'ansible_shell_executable' from source: unknown 49915 1727204315.07753: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204315.07758: variable 'ansible_pipelining' from source: unknown 49915 1727204315.07761: variable 'ansible_timeout' from source: unknown 49915 1727204315.07763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204315.07923: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204315.07938: variable 'omit' from source: magic vars 49915 1727204315.07941: starting attempt loop 49915 1727204315.07943: running the handler 49915 1727204315.08054: variable 'lsr_net_profile_fingerprint' from source: set_fact 49915 1727204315.08057: Evaluated conditional (lsr_net_profile_fingerprint): True 49915 1727204315.08065: handler run complete 49915 1727204315.08082: attempt loop complete, returning result 49915 1727204315.08087: _execute() done 49915 1727204315.08090: dumping result to json 49915 1727204315.08092: done dumping result, returning 49915 1727204315.08095: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in lsr101.90 [028d2410-947f-dcd7-b5af-0000000006c2] 49915 1727204315.08157: sending task result for task 028d2410-947f-dcd7-b5af-0000000006c2 49915 1727204315.08285: done sending task result for task 028d2410-947f-dcd7-b5af-0000000006c2 49915 1727204315.08289: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 49915 1727204315.08342: no more pending results, returning what we have 49915 1727204315.08345: results queue empty 49915 1727204315.08347: checking for any_errors_fatal 49915 1727204315.08351: done checking for any_errors_fatal 49915 1727204315.08352: checking for max_fail_percentage 49915 1727204315.08354: done checking for max_fail_percentage 49915 1727204315.08355: checking to see if all hosts have failed and the running result is not ok 49915 1727204315.08356: done checking to see if all hosts have failed 49915 1727204315.08357: getting the remaining hosts for this loop 49915 1727204315.08358: done getting the remaining hosts for this loop 49915 1727204315.08362: getting the next task for host managed-node2 49915 1727204315.08370: done getting next task for host managed-node2 49915 1727204315.08373: ^ task is: TASK: TEARDOWN: remove profiles. 49915 1727204315.08376: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204315.08483: getting variables 49915 1727204315.08485: in VariableManager get_vars() 49915 1727204315.08526: Calling all_inventory to load vars for managed-node2 49915 1727204315.08529: Calling groups_inventory to load vars for managed-node2 49915 1727204315.08531: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204315.08541: Calling all_plugins_play to load vars for managed-node2 49915 1727204315.08544: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204315.08547: Calling groups_plugins_play to load vars for managed-node2 49915 1727204315.09605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204315.10473: done with get_vars() 49915 1727204315.10490: done getting variables 49915 1727204315.10535: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] ********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:58 Tuesday 24 September 2024 14:58:35 -0400 (0:00:00.047) 0:00:21.811 ***** 49915 1727204315.10555: entering _queue_task() for managed-node2/debug 49915 1727204315.10790: worker is 1 (out of 1 available) 49915 1727204315.10804: exiting _queue_task() for managed-node2/debug 49915 1727204315.10818: done queuing things up, now waiting for results queue to drain 49915 1727204315.10820: waiting for pending results... 49915 1727204315.10997: running TaskExecutor() for managed-node2/TASK: TEARDOWN: remove profiles. 
49915 1727204315.11061: in run() - task 028d2410-947f-dcd7-b5af-00000000005d 49915 1727204315.11073: variable 'ansible_search_path' from source: unknown 49915 1727204315.11106: calling self._execute() 49915 1727204315.11184: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204315.11189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204315.11198: variable 'omit' from source: magic vars 49915 1727204315.11497: variable 'ansible_distribution_major_version' from source: facts 49915 1727204315.11508: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204315.11514: variable 'omit' from source: magic vars 49915 1727204315.11532: variable 'omit' from source: magic vars 49915 1727204315.11559: variable 'omit' from source: magic vars 49915 1727204315.11596: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204315.11625: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204315.11641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204315.11655: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204315.11666: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204315.11690: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204315.11695: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204315.11698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204315.11766: Set connection var ansible_connection to ssh 49915 1727204315.11770: Set connection var ansible_shell_type to sh 49915 1727204315.11772: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204315.11787: Set connection var ansible_shell_executable to /bin/sh 49915 1727204315.11790: Set connection var ansible_timeout to 10 49915 1727204315.11797: Set connection var ansible_pipelining to False 49915 1727204315.11814: variable 'ansible_shell_executable' from source: unknown 49915 1727204315.11820: variable 'ansible_connection' from source: unknown 49915 1727204315.11823: variable 'ansible_module_compression' from source: unknown 49915 1727204315.11826: variable 'ansible_shell_type' from source: unknown 49915 1727204315.11828: variable 'ansible_shell_executable' from source: unknown 49915 1727204315.11830: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204315.11834: variable 'ansible_pipelining' from source: unknown 49915 1727204315.11836: variable 'ansible_timeout' from source: unknown 49915 1727204315.11840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204315.11948: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204315.11956: variable 'omit' from source: magic vars 49915 1727204315.11959: starting attempt loop 49915 1727204315.11962: running the handler 49915 1727204315.12006: handler run complete 49915 1727204315.12019: attempt loop complete, 
returning result 49915 1727204315.12023: _execute() done 49915 1727204315.12028: dumping result to json 49915 1727204315.12030: done dumping result, returning 49915 1727204315.12033: done running TaskExecutor() for managed-node2/TASK: TEARDOWN: remove profiles. [028d2410-947f-dcd7-b5af-00000000005d] 49915 1727204315.12040: sending task result for task 028d2410-947f-dcd7-b5af-00000000005d 49915 1727204315.12117: done sending task result for task 028d2410-947f-dcd7-b5af-00000000005d 49915 1727204315.12120: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: ################################################## 49915 1727204315.12190: no more pending results, returning what we have 49915 1727204315.12193: results queue empty 49915 1727204315.12194: checking for any_errors_fatal 49915 1727204315.12200: done checking for any_errors_fatal 49915 1727204315.12201: checking for max_fail_percentage 49915 1727204315.12203: done checking for max_fail_percentage 49915 1727204315.12203: checking to see if all hosts have failed and the running result is not ok 49915 1727204315.12205: done checking to see if all hosts have failed 49915 1727204315.12205: getting the remaining hosts for this loop 49915 1727204315.12207: done getting the remaining hosts for this loop 49915 1727204315.12210: getting the next task for host managed-node2 49915 1727204315.12217: done getting next task for host managed-node2 49915 1727204315.12222: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 49915 1727204315.12225: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204315.12243: getting variables 49915 1727204315.12245: in VariableManager get_vars() 49915 1727204315.12281: Calling all_inventory to load vars for managed-node2 49915 1727204315.12284: Calling groups_inventory to load vars for managed-node2 49915 1727204315.12286: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204315.12294: Calling all_plugins_play to load vars for managed-node2 49915 1727204315.12296: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204315.12299: Calling groups_plugins_play to load vars for managed-node2 49915 1727204315.13081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204315.14236: done with get_vars() 49915 1727204315.14258: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:58:35 -0400 (0:00:00.037) 0:00:21.849 ***** 49915 1727204315.14351: entering _queue_task() for managed-node2/include_tasks 49915 1727204315.14661: worker is 1 (out of 1 available) 49915 1727204315.14678: exiting _queue_task() for managed-node2/include_tasks 49915 1727204315.14691: done queuing things up, now waiting for results queue to drain 49915 1727204315.14693: waiting for pending results... 
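
The next task queued is the include at roles/network/tasks/main.yml:4. The log confirms it is an include_tasks action and that it pulls in roles/network/tasks/set_facts.yml, so the task is presumably equivalent to the following sketch (any additional keywords on the real task are not visible in this log):

    - name: Ensure ansible_facts used by role
      include_tasks: set_facts.yml
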
49915 1727204315.14887: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 49915 1727204315.14984: in run() - task 028d2410-947f-dcd7-b5af-000000000065 49915 1727204315.14996: variable 'ansible_search_path' from source: unknown 49915 1727204315.15000: variable 'ansible_search_path' from source: unknown 49915 1727204315.15030: calling self._execute() 49915 1727204315.15097: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204315.15101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204315.15111: variable 'omit' from source: magic vars 49915 1727204315.15378: variable 'ansible_distribution_major_version' from source: facts 49915 1727204315.15387: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204315.15394: _execute() done 49915 1727204315.15397: dumping result to json 49915 1727204315.15401: done dumping result, returning 49915 1727204315.15415: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-dcd7-b5af-000000000065] 49915 1727204315.15418: sending task result for task 028d2410-947f-dcd7-b5af-000000000065 49915 1727204315.15497: done sending task result for task 028d2410-947f-dcd7-b5af-000000000065 49915 1727204315.15500: WORKER PROCESS EXITING 49915 1727204315.15552: no more pending results, returning what we have 49915 1727204315.15557: in VariableManager get_vars() 49915 1727204315.15604: Calling all_inventory to load vars for managed-node2 49915 1727204315.15606: Calling groups_inventory to load vars for managed-node2 49915 1727204315.15609: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204315.15620: Calling all_plugins_play to load vars for managed-node2 49915 1727204315.15624: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204315.15627: Calling groups_plugins_play to load vars for managed-node2 49915 1727204315.16494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204315.17820: done with get_vars() 49915 1727204315.17845: variable 'ansible_search_path' from source: unknown 49915 1727204315.17847: variable 'ansible_search_path' from source: unknown 49915 1727204315.17891: we have included files to process 49915 1727204315.17893: generating all_blocks data 49915 1727204315.17895: done generating all_blocks data 49915 1727204315.17901: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 49915 1727204315.17902: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 49915 1727204315.17904: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 49915 1727204315.18486: done processing included file 49915 1727204315.18488: iterating over new_blocks loaded from include file 49915 1727204315.18490: in VariableManager get_vars() 49915 1727204315.18514: done with get_vars() 49915 1727204315.18516: filtering new block on tags 49915 1727204315.18531: done filtering new block on tags 49915 1727204315.18534: in VariableManager get_vars() 49915 1727204315.18555: done with get_vars() 49915 1727204315.18557: filtering new block on tags 49915 1727204315.18577: done filtering new block on tags 49915 1727204315.18580: in 
VariableManager get_vars() 49915 1727204315.18601: done with get_vars() 49915 1727204315.18602: filtering new block on tags 49915 1727204315.18618: done filtering new block on tags 49915 1727204315.18620: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 49915 1727204315.18626: extending task lists for all hosts with included blocks 49915 1727204315.19314: done extending task lists 49915 1727204315.19315: done processing included files 49915 1727204315.19316: results queue empty 49915 1727204315.19317: checking for any_errors_fatal 49915 1727204315.19319: done checking for any_errors_fatal 49915 1727204315.19320: checking for max_fail_percentage 49915 1727204315.19321: done checking for max_fail_percentage 49915 1727204315.19322: checking to see if all hosts have failed and the running result is not ok 49915 1727204315.19323: done checking to see if all hosts have failed 49915 1727204315.19323: getting the remaining hosts for this loop 49915 1727204315.19324: done getting the remaining hosts for this loop 49915 1727204315.19327: getting the next task for host managed-node2 49915 1727204315.19331: done getting next task for host managed-node2 49915 1727204315.19333: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 49915 1727204315.19336: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204315.19345: getting variables 49915 1727204315.19346: in VariableManager get_vars() 49915 1727204315.19361: Calling all_inventory to load vars for managed-node2 49915 1727204315.19363: Calling groups_inventory to load vars for managed-node2 49915 1727204315.19364: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204315.19370: Calling all_plugins_play to load vars for managed-node2 49915 1727204315.19372: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204315.19376: Calling groups_plugins_play to load vars for managed-node2 49915 1727204315.20605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204315.22097: done with get_vars() 49915 1727204315.22121: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:58:35 -0400 (0:00:00.078) 0:00:21.928 ***** 49915 1727204315.22201: entering _queue_task() for managed-node2/setup 49915 1727204315.22540: worker is 1 (out of 1 available) 49915 1727204315.22552: exiting _queue_task() for managed-node2/setup 49915 1727204315.22564: done queuing things up, now waiting for results queue to drain 49915 1727204315.22566: waiting for pending results... 49915 1727204315.22774: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 49915 1727204315.22889: in run() - task 028d2410-947f-dcd7-b5af-000000000883 49915 1727204315.22921: variable 'ansible_search_path' from source: unknown 49915 1727204315.22925: variable 'ansible_search_path' from source: unknown 49915 1727204315.22950: calling self._execute() 49915 1727204315.23055: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204315.23059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204315.23062: variable 'omit' from source: magic vars 49915 1727204315.23399: variable 'ansible_distribution_major_version' from source: facts 49915 1727204315.23484: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204315.23619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204315.25488: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204315.25565: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204315.25586: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204315.25611: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204315.25632: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204315.25694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204315.25717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 49915 1727204315.25733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204315.25764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204315.25774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204315.25816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204315.25831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204315.25847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204315.25878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204315.25895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204315.26002: variable '__network_required_facts' from source: role '' defaults 49915 1727204315.26010: variable 'ansible_facts' from source: unknown 49915 1727204315.26448: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 49915 1727204315.26452: when evaluation is False, skipping this task 49915 1727204315.26455: _execute() done 49915 1727204315.26458: dumping result to json 49915 1727204315.26460: done dumping result, returning 49915 1727204315.26482: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-dcd7-b5af-000000000883] 49915 1727204315.26485: sending task result for task 028d2410-947f-dcd7-b5af-000000000883 49915 1727204315.26570: done sending task result for task 028d2410-947f-dcd7-b5af-000000000883 49915 1727204315.26573: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49915 1727204315.26664: no more pending results, returning what we have 49915 1727204315.26668: results queue empty 49915 1727204315.26669: checking for any_errors_fatal 49915 1727204315.26670: done checking for any_errors_fatal 49915 1727204315.26671: checking for max_fail_percentage 49915 1727204315.26672: done checking for max_fail_percentage 49915 1727204315.26673: checking to see if all hosts have failed and the running result is not ok 49915 1727204315.26674: done checking to see if all hosts have failed 49915 1727204315.26677: getting the remaining hosts for this loop 49915 1727204315.26678: done getting the remaining hosts for 
this loop 49915 1727204315.26682: getting the next task for host managed-node2 49915 1727204315.26690: done getting next task for host managed-node2 49915 1727204315.26694: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 49915 1727204315.26698: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204315.26719: getting variables 49915 1727204315.26721: in VariableManager get_vars() 49915 1727204315.26758: Calling all_inventory to load vars for managed-node2 49915 1727204315.26760: Calling groups_inventory to load vars for managed-node2 49915 1727204315.26762: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204315.26771: Calling all_plugins_play to load vars for managed-node2 49915 1727204315.26773: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204315.26777: Calling groups_plugins_play to load vars for managed-node2 49915 1727204315.28395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204315.30695: done with get_vars() 49915 1727204315.30724: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:58:35 -0400 (0:00:00.086) 0:00:22.014 ***** 49915 1727204315.30828: entering _queue_task() for managed-node2/stat 49915 1727204315.31151: worker is 1 (out of 1 available) 49915 1727204315.31163: exiting _queue_task() for managed-node2/stat 49915 1727204315.31177: done queuing things up, now waiting for results queue to drain 49915 1727204315.31179: waiting for pending results... 
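
The task skipped just above, "Ensure ansible_facts used by role are present" (set_facts.yml:3), is a 'setup' action gated on the fact-difference conditional shown in the log and marked no_log. A minimal sketch consistent with what the log records (the module arguments themselves are not visible here and are deliberately left out):

    - name: Ensure ansible_facts used by role are present
      setup:
        # module arguments (e.g. a gather_subset limited to the required facts)
        # are not shown in this log and are omitted from this sketch
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true

It was skipped because every fact listed in __network_required_facts was already present in ansible_facts, so the difference had length 0.
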
49915 1727204315.31662: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 49915 1727204315.31668: in run() - task 028d2410-947f-dcd7-b5af-000000000885 49915 1727204315.31671: variable 'ansible_search_path' from source: unknown 49915 1727204315.31674: variable 'ansible_search_path' from source: unknown 49915 1727204315.31678: calling self._execute() 49915 1727204315.31748: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204315.31759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204315.31763: variable 'omit' from source: magic vars 49915 1727204315.32161: variable 'ansible_distribution_major_version' from source: facts 49915 1727204315.32173: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204315.32418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204315.32931: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204315.32973: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204315.33122: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204315.33156: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204315.33381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204315.33481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204315.33485: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204315.33488: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204315.33583: variable '__network_is_ostree' from source: set_fact 49915 1727204315.33593: Evaluated conditional (not __network_is_ostree is defined): False 49915 1727204315.33597: when evaluation is False, skipping this task 49915 1727204315.33599: _execute() done 49915 1727204315.33691: dumping result to json 49915 1727204315.33694: done dumping result, returning 49915 1727204315.33696: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-dcd7-b5af-000000000885] 49915 1727204315.33698: sending task result for task 028d2410-947f-dcd7-b5af-000000000885 49915 1727204315.33862: done sending task result for task 028d2410-947f-dcd7-b5af-000000000885 49915 1727204315.33866: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 49915 1727204315.33914: no more pending results, returning what we have 49915 1727204315.33918: results queue empty 49915 1727204315.33919: checking for any_errors_fatal 49915 1727204315.33925: done checking for any_errors_fatal 49915 1727204315.33926: checking for 
max_fail_percentage 49915 1727204315.33928: done checking for max_fail_percentage 49915 1727204315.33929: checking to see if all hosts have failed and the running result is not ok 49915 1727204315.33930: done checking to see if all hosts have failed 49915 1727204315.33931: getting the remaining hosts for this loop 49915 1727204315.33932: done getting the remaining hosts for this loop 49915 1727204315.33935: getting the next task for host managed-node2 49915 1727204315.33941: done getting next task for host managed-node2 49915 1727204315.33945: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 49915 1727204315.33948: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204315.33964: getting variables 49915 1727204315.33965: in VariableManager get_vars() 49915 1727204315.34005: Calling all_inventory to load vars for managed-node2 49915 1727204315.34008: Calling groups_inventory to load vars for managed-node2 49915 1727204315.34011: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204315.34019: Calling all_plugins_play to load vars for managed-node2 49915 1727204315.34022: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204315.34025: Calling groups_plugins_play to load vars for managed-node2 49915 1727204315.41323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204315.42849: done with get_vars() 49915 1727204315.42875: done getting variables 49915 1727204315.42928: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:58:35 -0400 (0:00:00.121) 0:00:22.135 ***** 49915 1727204315.42961: entering _queue_task() for managed-node2/set_fact 49915 1727204315.43311: worker is 1 (out of 1 available) 49915 1727204315.43325: exiting _queue_task() for managed-node2/set_fact 49915 1727204315.43336: done queuing things up, now waiting for results queue to drain 49915 1727204315.43337: waiting for pending results... 
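
The "Check if system is ostree" task (set_facts.yml:12) skipped above is a 'stat' action guarded by "not __network_is_ostree is defined". Since the task was skipped, its arguments do not appear in the log; a plausible sketch, assuming the conventional ostree marker path and a hypothetical register name:

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted        # assumed marker path; not visible in this log
      register: __ostree_booted_stat    # hypothetical register name for illustration
      when: not __network_is_ostree is defined
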
49915 1727204315.43703: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 49915 1727204315.43781: in run() - task 028d2410-947f-dcd7-b5af-000000000886 49915 1727204315.43810: variable 'ansible_search_path' from source: unknown 49915 1727204315.43818: variable 'ansible_search_path' from source: unknown 49915 1727204315.43883: calling self._execute() 49915 1727204315.43927: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204315.43938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204315.43947: variable 'omit' from source: magic vars 49915 1727204315.44318: variable 'ansible_distribution_major_version' from source: facts 49915 1727204315.44353: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204315.44631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204315.44789: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204315.44817: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204315.44900: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204315.44920: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204315.45008: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204315.45035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204315.45064: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204315.45086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204315.45178: variable '__network_is_ostree' from source: set_fact 49915 1727204315.45189: Evaluated conditional (not __network_is_ostree is defined): False 49915 1727204315.45192: when evaluation is False, skipping this task 49915 1727204315.45195: _execute() done 49915 1727204315.45197: dumping result to json 49915 1727204315.45200: done dumping result, returning 49915 1727204315.45203: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-dcd7-b5af-000000000886] 49915 1727204315.45211: sending task result for task 028d2410-947f-dcd7-b5af-000000000886 49915 1727204315.45455: done sending task result for task 028d2410-947f-dcd7-b5af-000000000886 49915 1727204315.45458: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 49915 1727204315.45503: no more pending results, returning what we have 49915 1727204315.45507: results queue empty 49915 1727204315.45508: checking for any_errors_fatal 49915 1727204315.45514: done checking for any_errors_fatal 49915 
1727204315.45514: checking for max_fail_percentage 49915 1727204315.45516: done checking for max_fail_percentage 49915 1727204315.45517: checking to see if all hosts have failed and the running result is not ok 49915 1727204315.45518: done checking to see if all hosts have failed 49915 1727204315.45519: getting the remaining hosts for this loop 49915 1727204315.45520: done getting the remaining hosts for this loop 49915 1727204315.45524: getting the next task for host managed-node2 49915 1727204315.45533: done getting next task for host managed-node2 49915 1727204315.45537: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 49915 1727204315.45541: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204315.45557: getting variables 49915 1727204315.45559: in VariableManager get_vars() 49915 1727204315.45599: Calling all_inventory to load vars for managed-node2 49915 1727204315.45602: Calling groups_inventory to load vars for managed-node2 49915 1727204315.45604: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204315.45613: Calling all_plugins_play to load vars for managed-node2 49915 1727204315.45616: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204315.45619: Calling groups_plugins_play to load vars for managed-node2 49915 1727204315.46991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204315.48499: done with get_vars() 49915 1727204315.48521: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:58:35 -0400 (0:00:00.056) 0:00:22.192 ***** 49915 1727204315.48622: entering _queue_task() for managed-node2/service_facts 49915 1727204315.48937: worker is 1 (out of 1 available) 49915 1727204315.48949: exiting _queue_task() for managed-node2/service_facts 49915 1727204315.48963: done queuing things up, now waiting for results queue to drain 49915 1727204315.48964: waiting for pending results... 
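
The two remaining tasks from set_facts.yml seen here are the skipped set_fact at line 17 and the service_facts task at line 21 that is queued next. A sketch under the same assumptions as above (the set_fact value expression and the register it references are illustrative only; service_facts takes no arguments):

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # assumed expression
      when: not __network_is_ostree is defined

    - name: Check which services are running
      service_facts:

The entries that follow show how the service_facts task is actually executed: the connection and shell variables are resolved, a temporary directory is created on the remote host, the cached AnsiballZ_service_facts.py payload is copied over SFTP, made executable, run with /usr/bin/python3.12, and the resulting services dictionary is streamed back as JSON on stdout.
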
49915 1727204315.49283: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 49915 1727204315.49484: in run() - task 028d2410-947f-dcd7-b5af-000000000888 49915 1727204315.49490: variable 'ansible_search_path' from source: unknown 49915 1727204315.49493: variable 'ansible_search_path' from source: unknown 49915 1727204315.49496: calling self._execute() 49915 1727204315.49558: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204315.49564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204315.49574: variable 'omit' from source: magic vars 49915 1727204315.49955: variable 'ansible_distribution_major_version' from source: facts 49915 1727204315.49966: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204315.49972: variable 'omit' from source: magic vars 49915 1727204315.50045: variable 'omit' from source: magic vars 49915 1727204315.50140: variable 'omit' from source: magic vars 49915 1727204315.50143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204315.50168: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204315.50188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204315.50205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204315.50217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204315.50248: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204315.50251: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204315.50254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204315.50348: Set connection var ansible_connection to ssh 49915 1727204315.50356: Set connection var ansible_shell_type to sh 49915 1727204315.50359: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204315.50368: Set connection var ansible_shell_executable to /bin/sh 49915 1727204315.50373: Set connection var ansible_timeout to 10 49915 1727204315.50385: Set connection var ansible_pipelining to False 49915 1727204315.50408: variable 'ansible_shell_executable' from source: unknown 49915 1727204315.50411: variable 'ansible_connection' from source: unknown 49915 1727204315.50416: variable 'ansible_module_compression' from source: unknown 49915 1727204315.50419: variable 'ansible_shell_type' from source: unknown 49915 1727204315.50422: variable 'ansible_shell_executable' from source: unknown 49915 1727204315.50424: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204315.50427: variable 'ansible_pipelining' from source: unknown 49915 1727204315.50429: variable 'ansible_timeout' from source: unknown 49915 1727204315.50431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204315.51086: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49915 1727204315.51091: variable 'omit' from source: magic vars 49915 
1727204315.51093: starting attempt loop 49915 1727204315.51095: running the handler 49915 1727204315.51097: _low_level_execute_command(): starting 49915 1727204315.51099: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204315.51373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204315.51465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204315.51522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204315.51588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204315.53424: stdout chunk (state=3): >>>/root <<< 49915 1727204315.53484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204315.53497: stdout chunk (state=3): >>><<< 49915 1727204315.53514: stderr chunk (state=3): >>><<< 49915 1727204315.53547: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204315.53567: _low_level_execute_command(): starting 49915 1727204315.53581: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204315.5355399-51379-272654484401292 `" && echo ansible-tmp-1727204315.5355399-51379-272654484401292="` echo /root/.ansible/tmp/ansible-tmp-1727204315.5355399-51379-272654484401292 `" ) && 
sleep 0' 49915 1727204315.54270: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204315.54302: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204315.54345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204315.54418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204315.56311: stdout chunk (state=3): >>>ansible-tmp-1727204315.5355399-51379-272654484401292=/root/.ansible/tmp/ansible-tmp-1727204315.5355399-51379-272654484401292 <<< 49915 1727204315.56422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204315.56457: stderr chunk (state=3): >>><<< 49915 1727204315.56460: stdout chunk (state=3): >>><<< 49915 1727204315.56471: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204315.5355399-51379-272654484401292=/root/.ansible/tmp/ansible-tmp-1727204315.5355399-51379-272654484401292 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204315.56581: variable 'ansible_module_compression' from source: unknown 49915 1727204315.56585: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 49915 1727204315.56587: variable 'ansible_facts' from source: unknown 49915 1727204315.56645: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204315.5355399-51379-272654484401292/AnsiballZ_service_facts.py 49915 1727204315.56756: Sending initial data 49915 1727204315.56760: Sent initial data (162 bytes) 49915 1727204315.57210: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204315.57214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49915 1727204315.57217: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 49915 1727204315.57220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204315.57263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204315.57266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204315.57343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204315.58902: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204315.58998: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49915 1727204315.59081: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpf5u_ye56 /root/.ansible/tmp/ansible-tmp-1727204315.5355399-51379-272654484401292/AnsiballZ_service_facts.py <<< 49915 1727204315.59085: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204315.5355399-51379-272654484401292/AnsiballZ_service_facts.py" <<< 49915 1727204315.59144: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpf5u_ye56" to remote "/root/.ansible/tmp/ansible-tmp-1727204315.5355399-51379-272654484401292/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204315.5355399-51379-272654484401292/AnsiballZ_service_facts.py" <<< 49915 1727204315.60331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204315.60512: stderr chunk (state=3): >>><<< 49915 1727204315.60516: stdout chunk (state=3): >>><<< 49915 1727204315.60528: done transferring module to remote 49915 1727204315.60534: _low_level_execute_command(): starting 49915 1727204315.60536: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204315.5355399-51379-272654484401292/ /root/.ansible/tmp/ansible-tmp-1727204315.5355399-51379-272654484401292/AnsiballZ_service_facts.py && sleep 0' 49915 1727204315.60918: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204315.60931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204315.60981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204315.60995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204315.61073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204315.62840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204315.62844: stdout chunk (state=3): >>><<< 49915 1727204315.62850: stderr chunk (state=3): >>><<< 49915 1727204315.62864: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204315.62867: _low_level_execute_command(): starting 49915 1727204315.62871: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204315.5355399-51379-272654484401292/AnsiballZ_service_facts.py && sleep 0' 49915 1727204315.63481: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204315.63525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204315.63538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204315.63577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204315.63690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204317.20946: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", 
"status": "enabled", "source":<<< 49915 1727204317.20957: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": 
"systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 49915 1727204317.20964: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": 
{"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 49915 1727204317.22553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 49915 1727204317.22556: stdout chunk (state=3): >>><<< 49915 1727204317.22559: stderr chunk (state=3): >>><<< 49915 1727204317.22592: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": 
"systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": 
{"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 49915 1727204317.23656: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204315.5355399-51379-272654484401292/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204317.23674: _low_level_execute_command(): starting 49915 1727204317.23691: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204315.5355399-51379-272654484401292/ > /dev/null 2>&1 && sleep 0' 49915 1727204317.24309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204317.24326: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204317.24339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204317.24359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204317.24374: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204317.24469: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204317.24493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204317.24507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204317.24609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204317.26679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204317.26683: stdout chunk (state=3): >>><<< 49915 1727204317.26686: stderr chunk (state=3): >>><<< 49915 1727204317.26701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204317.26715: handler run complete 49915 1727204317.26988: variable 'ansible_facts' from source: unknown 49915 1727204317.27145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204317.27644: variable 'ansible_facts' from source: unknown 49915 1727204317.27788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204317.28180: attempt loop complete, returning result 49915 1727204317.28183: _execute() done 49915 1727204317.28186: dumping result to json 49915 1727204317.28188: done dumping result, returning 49915 1727204317.28190: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-dcd7-b5af-000000000888] 49915 1727204317.28192: sending task result for task 028d2410-947f-dcd7-b5af-000000000888 49915 1727204317.29435: done sending task result for task 028d2410-947f-dcd7-b5af-000000000888 49915 1727204317.29438: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49915 1727204317.29554: no more pending results, returning what we have 49915 1727204317.29557: results queue empty 49915 1727204317.29558: checking for any_errors_fatal 49915 1727204317.29561: done checking for any_errors_fatal 49915 1727204317.29562: checking for max_fail_percentage 49915 1727204317.29563: done checking for max_fail_percentage 49915 1727204317.29564: checking to see if all hosts have failed and the running result is not ok 49915 1727204317.29565: done checking to see if all hosts have failed 49915 1727204317.29566: getting the remaining hosts for this loop 49915 1727204317.29567: done getting the remaining hosts for this loop 49915 1727204317.29570: getting the next task for host managed-node2 49915 1727204317.29579: done getting next task for host managed-node2 49915 1727204317.29583: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 49915 1727204317.29586: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204317.29596: getting variables 49915 1727204317.29598: in VariableManager get_vars() 49915 1727204317.29636: Calling all_inventory to load vars for managed-node2 49915 1727204317.29639: Calling groups_inventory to load vars for managed-node2 49915 1727204317.29642: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204317.29651: Calling all_plugins_play to load vars for managed-node2 49915 1727204317.29653: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204317.29656: Calling groups_plugins_play to load vars for managed-node2 49915 1727204317.30877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204317.32478: done with get_vars() 49915 1727204317.32502: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:58:37 -0400 (0:00:01.839) 0:00:24.032 ***** 49915 1727204317.32603: entering _queue_task() for managed-node2/package_facts 49915 1727204317.33011: worker is 1 (out of 1 available) 49915 1727204317.33024: exiting _queue_task() for managed-node2/package_facts 49915 1727204317.33035: done queuing things up, now waiting for results queue to drain 49915 1727204317.33036: waiting for pending results... 
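The two fact-gathering tasks traced here come from the network role's set_facts.yml (the task path is shown above for the package check: roles/network/tasks/set_facts.yml:26). The service check result is printed as "censored" because the module was invoked with '_ansible_no_log': True, i.e. the task sets no_log. A minimal sketch of what these two tasks plausibly look like follows; the exact YAML bodies are an assumption, while the module names (service_facts, package_facts), the task names, and the no_log flag are taken from the surrounding log:

    - name: Check which services are running
      service_facts:
      no_log: true            # matches the "censored" result logged above

    - name: Check which packages are installed
      package_facts:
        manager: auto          # assumed default; populates ansible_facts.packages

The package_facts output streamed in the stdout chunks below is what ends up in ansible_facts.packages: a dict keyed by package name, where each value is a list of {name, version, release, epoch, arch, source} entries, so a later task could, for example, gate on "'NetworkManager' in ansible_facts.packages".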
49915 1727204317.33396: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 49915 1727204317.33407: in run() - task 028d2410-947f-dcd7-b5af-000000000889 49915 1727204317.33430: variable 'ansible_search_path' from source: unknown 49915 1727204317.33438: variable 'ansible_search_path' from source: unknown 49915 1727204317.33482: calling self._execute() 49915 1727204317.33582: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204317.33605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204317.33623: variable 'omit' from source: magic vars 49915 1727204317.34027: variable 'ansible_distribution_major_version' from source: facts 49915 1727204317.34051: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204317.34063: variable 'omit' from source: magic vars 49915 1727204317.34146: variable 'omit' from source: magic vars 49915 1727204317.34190: variable 'omit' from source: magic vars 49915 1727204317.34255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204317.34282: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204317.34308: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204317.34364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204317.34367: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204317.34388: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204317.34397: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204317.34405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204317.34512: Set connection var ansible_connection to ssh 49915 1727204317.34583: Set connection var ansible_shell_type to sh 49915 1727204317.34586: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204317.34589: Set connection var ansible_shell_executable to /bin/sh 49915 1727204317.34591: Set connection var ansible_timeout to 10 49915 1727204317.34594: Set connection var ansible_pipelining to False 49915 1727204317.34596: variable 'ansible_shell_executable' from source: unknown 49915 1727204317.34598: variable 'ansible_connection' from source: unknown 49915 1727204317.34600: variable 'ansible_module_compression' from source: unknown 49915 1727204317.34607: variable 'ansible_shell_type' from source: unknown 49915 1727204317.34615: variable 'ansible_shell_executable' from source: unknown 49915 1727204317.34622: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204317.34630: variable 'ansible_pipelining' from source: unknown 49915 1727204317.34636: variable 'ansible_timeout' from source: unknown 49915 1727204317.34645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204317.34851: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49915 1727204317.34868: variable 'omit' from source: magic vars 49915 
1727204317.34879: starting attempt loop 49915 1727204317.34909: running the handler 49915 1727204317.34913: _low_level_execute_command(): starting 49915 1727204317.34923: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204317.35706: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204317.35804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204317.35845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204317.35864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204317.35881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204317.36006: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204317.37714: stdout chunk (state=3): >>>/root <<< 49915 1727204317.37873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204317.37879: stdout chunk (state=3): >>><<< 49915 1727204317.37882: stderr chunk (state=3): >>><<< 49915 1727204317.37903: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204317.37925: _low_level_execute_command(): starting 49915 1727204317.37936: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204317.3791125-51671-83929108781414 `" && echo 
ansible-tmp-1727204317.3791125-51671-83929108781414="` echo /root/.ansible/tmp/ansible-tmp-1727204317.3791125-51671-83929108781414 `" ) && sleep 0' 49915 1727204317.38561: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204317.38669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204317.38673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204317.38733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204317.38819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204317.40792: stdout chunk (state=3): >>>ansible-tmp-1727204317.3791125-51671-83929108781414=/root/.ansible/tmp/ansible-tmp-1727204317.3791125-51671-83929108781414 <<< 49915 1727204317.40952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204317.40956: stdout chunk (state=3): >>><<< 49915 1727204317.40959: stderr chunk (state=3): >>><<< 49915 1727204317.40978: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204317.3791125-51671-83929108781414=/root/.ansible/tmp/ansible-tmp-1727204317.3791125-51671-83929108781414 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204317.41034: variable 'ansible_module_compression' from source: unknown 49915 1727204317.41099: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 49915 1727204317.41202: variable 'ansible_facts' from source: unknown 49915 1727204317.41400: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204317.3791125-51671-83929108781414/AnsiballZ_package_facts.py 49915 1727204317.41648: Sending initial data 49915 1727204317.41651: Sent initial data (161 bytes) 49915 1727204317.42613: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204317.42628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204317.42642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204317.42697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204317.42764: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204317.42783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204317.42811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204317.42922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204317.44533: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204317.44624: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49915 1727204317.44693: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpukltsy40 /root/.ansible/tmp/ansible-tmp-1727204317.3791125-51671-83929108781414/AnsiballZ_package_facts.py <<< 49915 1727204317.44716: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204317.3791125-51671-83929108781414/AnsiballZ_package_facts.py" <<< 49915 1727204317.44782: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpukltsy40" to remote "/root/.ansible/tmp/ansible-tmp-1727204317.3791125-51671-83929108781414/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204317.3791125-51671-83929108781414/AnsiballZ_package_facts.py" <<< 49915 1727204317.46408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204317.46502: stderr chunk (state=3): >>><<< 49915 1727204317.46506: stdout chunk (state=3): >>><<< 49915 1727204317.46518: done transferring module to remote 49915 1727204317.46535: _low_level_execute_command(): starting 49915 1727204317.46618: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204317.3791125-51671-83929108781414/ /root/.ansible/tmp/ansible-tmp-1727204317.3791125-51671-83929108781414/AnsiballZ_package_facts.py && sleep 0' 49915 1727204317.47171: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204317.47189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204317.47210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204317.47235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204317.47252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204317.47264: stderr chunk (state=3): >>>debug2: match not found <<< 49915 1727204317.47331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204317.47371: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204317.47392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204317.47421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204317.47550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204317.49370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204317.49397: stdout chunk (state=3): >>><<< 49915 1727204317.49399: stderr chunk (state=3): >>><<< 49915 1727204317.49414: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204317.49481: _low_level_execute_command(): starting 49915 1727204317.49485: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204317.3791125-51671-83929108781414/AnsiballZ_package_facts.py && sleep 0' 49915 1727204317.50080: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204317.50097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204317.50111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204317.50139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204317.50156: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204317.50256: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204317.50271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204317.50290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204317.50316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204317.50432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204317.95238: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": 
[{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", 
"release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 49915 1727204317.95294: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": 
"51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": 
[{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": 
"0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 49915 1727204317.95322: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 49915 1727204317.95388: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": 
"25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": 
[{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [<<< 49915 1727204317.95401: stdout chunk (state=3): >>>{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": 
"libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "<<< 49915 1727204317.95406: stdout chunk (state=3): >>>3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": 
[{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", 
"version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch<<< 49915 1727204317.95426: stdout chunk (state=3): >>>": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": 
"3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch",<<< 49915 1727204317.95443: stdout chunk (state=3): >>> "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": 
"vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch<<< 49915 1727204317.95450: stdout chunk (state=3): >>>": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cl<<< 49915 1727204317.95481: stdout chunk (state=3): >>>oud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 49915 1727204317.97185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 49915 1727204317.97215: stderr chunk (state=3): >>><<< 49915 1727204317.97218: stdout chunk (state=3): >>><<< 49915 1727204317.97257: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": 
[{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": 
"3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", 
"release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", 
"version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 49915 1727204317.98939: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204317.3791125-51671-83929108781414/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204317.98943: _low_level_execute_command(): starting 49915 1727204317.98952: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204317.3791125-51671-83929108781414/ > /dev/null 2>&1 && sleep 0' 49915 1727204317.99550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204317.99567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204317.99590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204317.99620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204317.99634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204317.99715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204318.01591: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 49915 1727204318.01627: stderr chunk (state=3): >>><<< 49915 1727204318.01630: stdout chunk (state=3): >>><<< 49915 1727204318.01633: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204318.01635: handler run complete 49915 1727204318.02150: variable 'ansible_facts' from source: unknown 49915 1727204318.02466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204318.03568: variable 'ansible_facts' from source: unknown 49915 1727204318.04163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204318.04606: attempt loop complete, returning result 49915 1727204318.04620: _execute() done 49915 1727204318.04623: dumping result to json 49915 1727204318.04825: done dumping result, returning 49915 1727204318.04828: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-dcd7-b5af-000000000889] 49915 1727204318.04833: sending task result for task 028d2410-947f-dcd7-b5af-000000000889 49915 1727204318.07199: done sending task result for task 028d2410-947f-dcd7-b5af-000000000889 49915 1727204318.07207: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49915 1727204318.07357: no more pending results, returning what we have 49915 1727204318.07360: results queue empty 49915 1727204318.07361: checking for any_errors_fatal 49915 1727204318.07365: done checking for any_errors_fatal 49915 1727204318.07366: checking for max_fail_percentage 49915 1727204318.07367: done checking for max_fail_percentage 49915 1727204318.07368: checking to see if all hosts have failed and the running result is not ok 49915 1727204318.07369: done checking to see if all hosts have failed 49915 1727204318.07369: getting the remaining hosts for this loop 49915 1727204318.07371: done getting the remaining hosts for this loop 49915 1727204318.07374: getting the next task for host managed-node2 49915 1727204318.07384: done getting next task for host managed-node2 49915 1727204318.07387: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 49915 1727204318.07390: ^ state is: HOST STATE: 
block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204318.07399: getting variables 49915 1727204318.07401: in VariableManager get_vars() 49915 1727204318.07438: Calling all_inventory to load vars for managed-node2 49915 1727204318.07441: Calling groups_inventory to load vars for managed-node2 49915 1727204318.07443: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204318.07452: Calling all_plugins_play to load vars for managed-node2 49915 1727204318.07454: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204318.07457: Calling groups_plugins_play to load vars for managed-node2 49915 1727204318.08695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204318.10401: done with get_vars() 49915 1727204318.10426: done getting variables 49915 1727204318.10501: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:58:38 -0400 (0:00:00.779) 0:00:24.811 ***** 49915 1727204318.10536: entering _queue_task() for managed-node2/debug 49915 1727204318.11095: worker is 1 (out of 1 available) 49915 1727204318.11106: exiting _queue_task() for managed-node2/debug 49915 1727204318.11116: done queuing things up, now waiting for results queue to drain 49915 1727204318.11117: waiting for pending results... 
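The module result dumped above (hidden from normal output by no_log) shows the shape of the data that package_facts returns: ansible_facts.packages is a dictionary keyed by package name, and each value is a list of {name, version, release, epoch, arch, source} entries so that multiple installed versions of the same package can coexist. As a minimal sketch of gathering and querying that structure (the playbook below is illustrative and not taken from the role; only the module name, its manager argument, and the fact layout come from the log):

- hosts: managed-node2
  gather_facts: false
  tasks:
    - name: Check which packages are installed (sketch)
      ansible.builtin.package_facts:
        manager: auto
      no_log: true

    - name: Report whether NetworkManager is installed (sketch)
      ansible.builtin.debug:
        msg: "{{ ansible_facts.packages['NetworkManager'] | default('NetworkManager not installed') }}"

Because no_log was set for the real task, the controller prints only the censored placeholder seen above while still storing the full package list in ansible_facts for later conditionals.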
49915 1727204318.11246: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 49915 1727204318.11346: in run() - task 028d2410-947f-dcd7-b5af-000000000066 49915 1727204318.11365: variable 'ansible_search_path' from source: unknown 49915 1727204318.11371: variable 'ansible_search_path' from source: unknown 49915 1727204318.11412: calling self._execute() 49915 1727204318.11561: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204318.11564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204318.11567: variable 'omit' from source: magic vars 49915 1727204318.11930: variable 'ansible_distribution_major_version' from source: facts 49915 1727204318.11948: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204318.11967: variable 'omit' from source: magic vars 49915 1727204318.12047: variable 'omit' from source: magic vars 49915 1727204318.12178: variable 'network_provider' from source: set_fact 49915 1727204318.12208: variable 'omit' from source: magic vars 49915 1727204318.12269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204318.12325: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204318.12367: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204318.12405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204318.12432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204318.12471: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204318.12535: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204318.12539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204318.12641: Set connection var ansible_connection to ssh 49915 1727204318.12664: Set connection var ansible_shell_type to sh 49915 1727204318.12752: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204318.12756: Set connection var ansible_shell_executable to /bin/sh 49915 1727204318.12759: Set connection var ansible_timeout to 10 49915 1727204318.12761: Set connection var ansible_pipelining to False 49915 1727204318.12763: variable 'ansible_shell_executable' from source: unknown 49915 1727204318.12766: variable 'ansible_connection' from source: unknown 49915 1727204318.12768: variable 'ansible_module_compression' from source: unknown 49915 1727204318.12770: variable 'ansible_shell_type' from source: unknown 49915 1727204318.12772: variable 'ansible_shell_executable' from source: unknown 49915 1727204318.12774: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204318.12781: variable 'ansible_pipelining' from source: unknown 49915 1727204318.12784: variable 'ansible_timeout' from source: unknown 49915 1727204318.12786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204318.12955: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 49915 1727204318.12978: variable 'omit' from source: magic vars 49915 1727204318.12999: starting attempt loop 49915 1727204318.13007: running the handler 49915 1727204318.13080: handler run complete 49915 1727204318.13084: attempt loop complete, returning result 49915 1727204318.13086: _execute() done 49915 1727204318.13089: dumping result to json 49915 1727204318.13097: done dumping result, returning 49915 1727204318.13113: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-dcd7-b5af-000000000066] 49915 1727204318.13189: sending task result for task 028d2410-947f-dcd7-b5af-000000000066 ok: [managed-node2] => {} MSG: Using network provider: nm 49915 1727204318.13358: no more pending results, returning what we have 49915 1727204318.13362: results queue empty 49915 1727204318.13363: checking for any_errors_fatal 49915 1727204318.13371: done checking for any_errors_fatal 49915 1727204318.13372: checking for max_fail_percentage 49915 1727204318.13374: done checking for max_fail_percentage 49915 1727204318.13379: checking to see if all hosts have failed and the running result is not ok 49915 1727204318.13380: done checking to see if all hosts have failed 49915 1727204318.13381: getting the remaining hosts for this loop 49915 1727204318.13382: done getting the remaining hosts for this loop 49915 1727204318.13386: getting the next task for host managed-node2 49915 1727204318.13393: done getting next task for host managed-node2 49915 1727204318.13397: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 49915 1727204318.13402: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204318.13414: getting variables 49915 1727204318.13417: in VariableManager get_vars() 49915 1727204318.13458: Calling all_inventory to load vars for managed-node2 49915 1727204318.13461: Calling groups_inventory to load vars for managed-node2 49915 1727204318.13464: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204318.13475: Calling all_plugins_play to load vars for managed-node2 49915 1727204318.13747: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204318.13753: done sending task result for task 028d2410-947f-dcd7-b5af-000000000066 49915 1727204318.13756: WORKER PROCESS EXITING 49915 1727204318.13760: Calling groups_plugins_play to load vars for managed-node2 49915 1727204318.15079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204318.16646: done with get_vars() 49915 1727204318.16670: done getting variables 49915 1727204318.16730: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:58:38 -0400 (0:00:00.062) 0:00:24.873 ***** 49915 1727204318.16763: entering _queue_task() for managed-node2/fail 49915 1727204318.17089: worker is 1 (out of 1 available) 49915 1727204318.17102: exiting _queue_task() for managed-node2/fail 49915 1727204318.17115: done queuing things up, now waiting for results queue to drain 49915 1727204318.17117: waiting for pending results... 
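The task that just completed (tasks/main.yml:7 in the network role) is a plain debug action: the log shows it resolving network_provider (set earlier via set_fact) and printing "Using network provider: nm", with the role-wide ansible_distribution_major_version != '6' conditional evaluated first. A sketch of an equivalent task, assuming the message wording from the log output rather than the role source:

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
  when: ansible_distribution_major_version != '6'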
49915 1727204318.17397: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 49915 1727204318.17541: in run() - task 028d2410-947f-dcd7-b5af-000000000067 49915 1727204318.17558: variable 'ansible_search_path' from source: unknown 49915 1727204318.17565: variable 'ansible_search_path' from source: unknown 49915 1727204318.17609: calling self._execute() 49915 1727204318.17709: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204318.17722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204318.17741: variable 'omit' from source: magic vars 49915 1727204318.18102: variable 'ansible_distribution_major_version' from source: facts 49915 1727204318.18118: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204318.18244: variable 'network_state' from source: role '' defaults 49915 1727204318.18261: Evaluated conditional (network_state != {}): False 49915 1727204318.18271: when evaluation is False, skipping this task 49915 1727204318.18285: _execute() done 49915 1727204318.18293: dumping result to json 49915 1727204318.18301: done dumping result, returning 49915 1727204318.18384: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-dcd7-b5af-000000000067] 49915 1727204318.18388: sending task result for task 028d2410-947f-dcd7-b5af-000000000067 49915 1727204318.18462: done sending task result for task 028d2410-947f-dcd7-b5af-000000000067 49915 1727204318.18466: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49915 1727204318.18523: no more pending results, returning what we have 49915 1727204318.18529: results queue empty 49915 1727204318.18529: checking for any_errors_fatal 49915 1727204318.18536: done checking for any_errors_fatal 49915 1727204318.18537: checking for max_fail_percentage 49915 1727204318.18539: done checking for max_fail_percentage 49915 1727204318.18540: checking to see if all hosts have failed and the running result is not ok 49915 1727204318.18541: done checking to see if all hosts have failed 49915 1727204318.18541: getting the remaining hosts for this loop 49915 1727204318.18543: done getting the remaining hosts for this loop 49915 1727204318.18547: getting the next task for host managed-node2 49915 1727204318.18556: done getting next task for host managed-node2 49915 1727204318.18561: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 49915 1727204318.18564: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204318.18690: getting variables 49915 1727204318.18693: in VariableManager get_vars() 49915 1727204318.18738: Calling all_inventory to load vars for managed-node2 49915 1727204318.18742: Calling groups_inventory to load vars for managed-node2 49915 1727204318.18744: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204318.18758: Calling all_plugins_play to load vars for managed-node2 49915 1727204318.18761: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204318.18764: Calling groups_plugins_play to load vars for managed-node2 49915 1727204318.20229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204318.23188: done with get_vars() 49915 1727204318.23218: done getting variables 49915 1727204318.23282: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:58:38 -0400 (0:00:00.066) 0:00:24.940 ***** 49915 1727204318.23448: entering _queue_task() for managed-node2/fail 49915 1727204318.23796: worker is 1 (out of 1 available) 49915 1727204318.23810: exiting _queue_task() for managed-node2/fail 49915 1727204318.23822: done queuing things up, now waiting for results queue to drain 49915 1727204318.23824: waiting for pending results... 
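This guard and the previous one skip for the same reason: network_state is resolved from the role defaults and network_state != {} evaluates to False, which implies the default is an empty mapping. A hypothetical sketch of that default and of a play-level override that would flip these conditionals (the interface values are purely illustrative):

    ---
    # Implied by the log: the role default leaves network_state empty
    network_state: {}
    ---
    # Hypothetical play vars that would make `network_state != {}` evaluate True
    network_state:
      interfaces:
        - name: eth1        # illustrative values, not taken from this run
          type: ethernet
          state: up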
49915 1727204318.24208: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 49915 1727204318.24305: in run() - task 028d2410-947f-dcd7-b5af-000000000068 49915 1727204318.24310: variable 'ansible_search_path' from source: unknown 49915 1727204318.24313: variable 'ansible_search_path' from source: unknown 49915 1727204318.24345: calling self._execute() 49915 1727204318.24482: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204318.24486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204318.24489: variable 'omit' from source: magic vars 49915 1727204318.24866: variable 'ansible_distribution_major_version' from source: facts 49915 1727204318.24887: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204318.25020: variable 'network_state' from source: role '' defaults 49915 1727204318.25034: Evaluated conditional (network_state != {}): False 49915 1727204318.25064: when evaluation is False, skipping this task 49915 1727204318.25066: _execute() done 49915 1727204318.25068: dumping result to json 49915 1727204318.25071: done dumping result, returning 49915 1727204318.25073: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-dcd7-b5af-000000000068] 49915 1727204318.25078: sending task result for task 028d2410-947f-dcd7-b5af-000000000068 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49915 1727204318.25328: no more pending results, returning what we have 49915 1727204318.25332: results queue empty 49915 1727204318.25333: checking for any_errors_fatal 49915 1727204318.25343: done checking for any_errors_fatal 49915 1727204318.25344: checking for max_fail_percentage 49915 1727204318.25346: done checking for max_fail_percentage 49915 1727204318.25347: checking to see if all hosts have failed and the running result is not ok 49915 1727204318.25348: done checking to see if all hosts have failed 49915 1727204318.25349: getting the remaining hosts for this loop 49915 1727204318.25350: done getting the remaining hosts for this loop 49915 1727204318.25355: getting the next task for host managed-node2 49915 1727204318.25362: done getting next task for host managed-node2 49915 1727204318.25366: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 49915 1727204318.25370: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204318.25394: getting variables 49915 1727204318.25396: in VariableManager get_vars() 49915 1727204318.25439: Calling all_inventory to load vars for managed-node2 49915 1727204318.25442: Calling groups_inventory to load vars for managed-node2 49915 1727204318.25444: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204318.25455: Calling all_plugins_play to load vars for managed-node2 49915 1727204318.25458: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204318.25461: Calling groups_plugins_play to load vars for managed-node2 49915 1727204318.26084: done sending task result for task 028d2410-947f-dcd7-b5af-000000000068 49915 1727204318.26088: WORKER PROCESS EXITING 49915 1727204318.28147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204318.29866: done with get_vars() 49915 1727204318.29895: done getting variables 49915 1727204318.29957: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:58:38 -0400 (0:00:00.065) 0:00:25.006 ***** 49915 1727204318.29992: entering _queue_task() for managed-node2/fail 49915 1727204318.30323: worker is 1 (out of 1 available) 49915 1727204318.30335: exiting _queue_task() for managed-node2/fail 49915 1727204318.30347: done queuing things up, now waiting for results queue to drain 49915 1727204318.30348: waiting for pending results... 
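For the teaming guard queued above, the log evaluates three conditions in turn: the distribution major version is greater than 9, the distribution is in __network_rh_distros, and a selectattr chain that looks for team-typed entries. A sketch of how such a task could be written, using only the conditionals the log confirms (the failure message is an assumption):

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        # hypothetical wording; only the `when` conditions below appear in the log
        msg: Team interfaces are not supported on EL10 or later.
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        - >-
          network_connections | selectattr("type", "defined") |
          selectattr("type", "match", "^team$") | list | length > 0 or
          network_state.get("interfaces", []) | selectattr("type", "defined") |
          selectattr("type", "match", "^team$") | list | length > 0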
49915 1727204318.30636: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 49915 1727204318.30774: in run() - task 028d2410-947f-dcd7-b5af-000000000069 49915 1727204318.30804: variable 'ansible_search_path' from source: unknown 49915 1727204318.30812: variable 'ansible_search_path' from source: unknown 49915 1727204318.30851: calling self._execute() 49915 1727204318.30952: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204318.30964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204318.30980: variable 'omit' from source: magic vars 49915 1727204318.31362: variable 'ansible_distribution_major_version' from source: facts 49915 1727204318.31382: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204318.31544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204318.33797: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204318.33867: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204318.33982: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204318.33986: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204318.33988: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204318.34067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204318.34122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204318.34154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.34202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204318.34227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204318.34331: variable 'ansible_distribution_major_version' from source: facts 49915 1727204318.34352: Evaluated conditional (ansible_distribution_major_version | int > 9): True 49915 1727204318.34469: variable 'ansible_distribution' from source: facts 49915 1727204318.34480: variable '__network_rh_distros' from source: role '' defaults 49915 1727204318.34494: Evaluated conditional (ansible_distribution in __network_rh_distros): True 49915 1727204318.34980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204318.34983: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204318.34985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.34987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204318.34989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204318.34991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204318.34993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204318.34994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.34996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204318.34998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204318.35007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204318.35031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204318.35056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.35096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204318.35120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204318.35442: variable 'network_connections' from source: task vars 49915 1727204318.35457: variable 'interface' from source: play vars 49915 1727204318.35524: variable 'interface' from source: play vars 49915 1727204318.35543: variable 'vlan_interface' from source: play vars 49915 1727204318.35611: variable 'vlan_interface' from source: play vars 49915 1727204318.35626: variable 'network_state' from source: role '' defaults 
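The conditional evaluated just below keeps only entries whose type attribute is defined and matches ^team$, both in network_connections and in network_state.interfaces. The same filter chain can be exercised in isolation against made-up data; everything in this snippet (hosts, variable values) is hypothetical:

    - hosts: localhost
      gather_facts: false
      vars:
        network_connections:
          - name: team0
            type: team          # this entry makes the expression evaluate True
          - name: eth0
            type: ethernet
          - name: no-type-set   # entries without `type` are dropped by selectattr("type", "defined")
      tasks:
        - name: Show whether any team connection is defined
          ansible.builtin.debug:
            msg: >-
              {{ network_connections | selectattr("type", "defined") |
              selectattr("type", "match", "^team$") | list | length > 0 }}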
49915 1727204318.35710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204318.35887: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204318.35928: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204318.35961: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204318.35997: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204318.36045: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204318.36082: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204318.36118: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.36149: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204318.36182: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 49915 1727204318.36190: when evaluation is False, skipping this task 49915 1727204318.36202: _execute() done 49915 1727204318.36210: dumping result to json 49915 1727204318.36219: done dumping result, returning 49915 1727204318.36231: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-dcd7-b5af-000000000069] 49915 1727204318.36309: sending task result for task 028d2410-947f-dcd7-b5af-000000000069 49915 1727204318.36383: done sending task result for task 028d2410-947f-dcd7-b5af-000000000069 49915 1727204318.36386: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 49915 1727204318.36462: no more pending results, returning what we have 49915 1727204318.36466: results queue empty 49915 1727204318.36467: checking for any_errors_fatal 49915 1727204318.36473: done checking for any_errors_fatal 49915 1727204318.36474: checking for max_fail_percentage 49915 1727204318.36478: done checking for max_fail_percentage 49915 1727204318.36479: checking to see if all hosts have failed and the running result is not ok 49915 1727204318.36480: done checking to see if all hosts have failed 49915 1727204318.36481: getting the remaining hosts for this loop 49915 1727204318.36483: done getting the remaining hosts for this loop 49915 1727204318.36487: getting the 
next task for host managed-node2 49915 1727204318.36494: done getting next task for host managed-node2 49915 1727204318.36498: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 49915 1727204318.36501: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204318.36519: getting variables 49915 1727204318.36521: in VariableManager get_vars() 49915 1727204318.36564: Calling all_inventory to load vars for managed-node2 49915 1727204318.36567: Calling groups_inventory to load vars for managed-node2 49915 1727204318.36570: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204318.36584: Calling all_plugins_play to load vars for managed-node2 49915 1727204318.36587: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204318.36591: Calling groups_plugins_play to load vars for managed-node2 49915 1727204318.38189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204318.39888: done with get_vars() 49915 1727204318.39911: done getting variables 49915 1727204318.39971: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:58:38 -0400 (0:00:00.100) 0:00:25.106 ***** 49915 1727204318.40007: entering _queue_task() for managed-node2/dnf 49915 1727204318.40345: worker is 1 (out of 1 available) 49915 1727204318.40357: exiting _queue_task() for managed-node2/dnf 49915 1727204318.40370: done queuing things up, now waiting for results queue to drain 49915 1727204318.40372: waiting for pending results... 
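For the DNF check queued above, the log confirms only the dnf action and two conditions: a Fedora-or-EL8+ gate and the wireless/team flags. The package list, state and check mode in the sketch below are assumptions added to make the example self-contained, not the role's actual task:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # assumed variable; not resolved at this point in the log
        state: latest
        update_only: true
      check_mode: true                   # assumption: the task only checks whether updates exist
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined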
49915 1727204318.40792: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 49915 1727204318.40798: in run() - task 028d2410-947f-dcd7-b5af-00000000006a 49915 1727204318.40813: variable 'ansible_search_path' from source: unknown 49915 1727204318.40821: variable 'ansible_search_path' from source: unknown 49915 1727204318.40862: calling self._execute() 49915 1727204318.40957: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204318.40969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204318.40984: variable 'omit' from source: magic vars 49915 1727204318.41363: variable 'ansible_distribution_major_version' from source: facts 49915 1727204318.41382: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204318.41590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204318.43791: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204318.43948: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204318.43952: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204318.43955: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204318.43970: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204318.44062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204318.44106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204318.44135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.44183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204318.44204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204318.44323: variable 'ansible_distribution' from source: facts 49915 1727204318.44333: variable 'ansible_distribution_major_version' from source: facts 49915 1727204318.44380: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 49915 1727204318.44470: variable '__network_wireless_connections_defined' from source: role '' defaults 49915 1727204318.44607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204318.44639: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204318.44669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.44724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204318.44740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204318.44833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204318.44836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204318.44841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.44886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204318.44907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204318.44955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204318.44985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204318.45015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.45065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204318.45159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204318.45240: variable 'network_connections' from source: task vars 49915 1727204318.45257: variable 'interface' from source: play vars 49915 1727204318.45329: variable 'interface' from source: play vars 49915 1727204318.45344: variable 'vlan_interface' from source: play vars 49915 1727204318.45411: variable 'vlan_interface' from source: play vars 49915 1727204318.45488: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204318.45655: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204318.45701: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204318.45745: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204318.45787: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204318.45841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204318.45853: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204318.45874: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.45894: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204318.45935: variable '__network_team_connections_defined' from source: role '' defaults 49915 1727204318.46106: variable 'network_connections' from source: task vars 49915 1727204318.46109: variable 'interface' from source: play vars 49915 1727204318.46154: variable 'interface' from source: play vars 49915 1727204318.46162: variable 'vlan_interface' from source: play vars 49915 1727204318.46206: variable 'vlan_interface' from source: play vars 49915 1727204318.46224: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 49915 1727204318.46227: when evaluation is False, skipping this task 49915 1727204318.46230: _execute() done 49915 1727204318.46232: dumping result to json 49915 1727204318.46234: done dumping result, returning 49915 1727204318.46244: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-dcd7-b5af-00000000006a] 49915 1727204318.46249: sending task result for task 028d2410-947f-dcd7-b5af-00000000006a 49915 1727204318.46335: done sending task result for task 028d2410-947f-dcd7-b5af-00000000006a 49915 1727204318.46338: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 49915 1727204318.46396: no more pending results, returning what we have 49915 1727204318.46399: results queue empty 49915 1727204318.46399: checking for any_errors_fatal 49915 1727204318.46407: done checking for any_errors_fatal 49915 1727204318.46407: checking for max_fail_percentage 49915 1727204318.46409: done checking for max_fail_percentage 49915 1727204318.46410: checking to see if all hosts have failed and the running result is not ok 49915 1727204318.46411: done checking to see if all hosts have failed 49915 1727204318.46411: getting the remaining hosts for this loop 
49915 1727204318.46415: done getting the remaining hosts for this loop 49915 1727204318.46419: getting the next task for host managed-node2 49915 1727204318.46426: done getting next task for host managed-node2 49915 1727204318.46430: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 49915 1727204318.46432: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204318.46451: getting variables 49915 1727204318.46453: in VariableManager get_vars() 49915 1727204318.46494: Calling all_inventory to load vars for managed-node2 49915 1727204318.46497: Calling groups_inventory to load vars for managed-node2 49915 1727204318.46499: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204318.46513: Calling all_plugins_play to load vars for managed-node2 49915 1727204318.46586: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204318.46592: Calling groups_plugins_play to load vars for managed-node2 49915 1727204318.47531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204318.48961: done with get_vars() 49915 1727204318.48993: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 49915 1727204318.49064: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:58:38 -0400 (0:00:00.090) 0:00:25.197 ***** 49915 1727204318.49109: entering _queue_task() for managed-node2/yum 49915 1727204318.49444: worker is 1 (out of 1 available) 49915 1727204318.49457: exiting _queue_task() for managed-node2/yum 49915 1727204318.49471: done queuing things up, now waiting for results queue to drain 49915 1727204318.49473: waiting for pending results... 
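The log notes that ansible.builtin.yum is redirected to ansible.builtin.dnf on this controller, and the YUM variant then skips just below because ansible_distribution_major_version | int < 8 is False. The underlying pattern is a simple version gate between two otherwise similar package tasks; a stripped-down, hypothetical pair (package name and states are placeholders):

    - name: Package check via YUM (EL7 and earlier)
      ansible.builtin.yum:
        name: NetworkManager
        state: present
      when: ansible_distribution_major_version | int < 8

    - name: Package check via DNF (Fedora, EL8 and later)
      ansible.builtin.dnf:
        name: NetworkManager
        state: present
      when: ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7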
49915 1727204318.49891: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 49915 1727204318.49895: in run() - task 028d2410-947f-dcd7-b5af-00000000006b 49915 1727204318.49899: variable 'ansible_search_path' from source: unknown 49915 1727204318.49902: variable 'ansible_search_path' from source: unknown 49915 1727204318.49931: calling self._execute() 49915 1727204318.50032: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204318.50043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204318.50056: variable 'omit' from source: magic vars 49915 1727204318.50457: variable 'ansible_distribution_major_version' from source: facts 49915 1727204318.50510: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204318.50692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204318.53311: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204318.53356: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204318.53387: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204318.53415: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204318.53437: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204318.53499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204318.53778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204318.53799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.53832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204318.53844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204318.53914: variable 'ansible_distribution_major_version' from source: facts 49915 1727204318.53929: Evaluated conditional (ansible_distribution_major_version | int < 8): False 49915 1727204318.53932: when evaluation is False, skipping this task 49915 1727204318.53937: _execute() done 49915 1727204318.53939: dumping result to json 49915 1727204318.53942: done dumping result, returning 49915 1727204318.53948: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-dcd7-b5af-00000000006b] 49915 
1727204318.53954: sending task result for task 028d2410-947f-dcd7-b5af-00000000006b 49915 1727204318.54040: done sending task result for task 028d2410-947f-dcd7-b5af-00000000006b 49915 1727204318.54043: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 49915 1727204318.54099: no more pending results, returning what we have 49915 1727204318.54102: results queue empty 49915 1727204318.54103: checking for any_errors_fatal 49915 1727204318.54108: done checking for any_errors_fatal 49915 1727204318.54109: checking for max_fail_percentage 49915 1727204318.54111: done checking for max_fail_percentage 49915 1727204318.54112: checking to see if all hosts have failed and the running result is not ok 49915 1727204318.54113: done checking to see if all hosts have failed 49915 1727204318.54113: getting the remaining hosts for this loop 49915 1727204318.54115: done getting the remaining hosts for this loop 49915 1727204318.54119: getting the next task for host managed-node2 49915 1727204318.54126: done getting next task for host managed-node2 49915 1727204318.54129: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 49915 1727204318.54132: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204318.54151: getting variables 49915 1727204318.54152: in VariableManager get_vars() 49915 1727204318.54202: Calling all_inventory to load vars for managed-node2 49915 1727204318.54205: Calling groups_inventory to load vars for managed-node2 49915 1727204318.54207: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204318.54219: Calling all_plugins_play to load vars for managed-node2 49915 1727204318.54228: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204318.54232: Calling groups_plugins_play to load vars for managed-node2 49915 1727204318.55662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204318.57428: done with get_vars() 49915 1727204318.57454: done getting variables 49915 1727204318.57519: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:58:38 -0400 (0:00:00.084) 0:00:25.281 ***** 49915 1727204318.57555: entering _queue_task() for managed-node2/fail 49915 1727204318.58292: worker is 1 (out of 1 available) 49915 1727204318.58308: exiting _queue_task() for managed-node2/fail 49915 1727204318.58320: done queuing things up, now waiting for results queue to drain 49915 1727204318.58322: waiting for pending results... 
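The consent task queued above is another fail guard; the skip result below confirms only that it is gated on __network_wireless_connections_defined or __network_team_connections_defined. The message and the opt-in flag in this sketch are assumptions used to illustrate the "ask user's consent" idea, not the role's real variables:

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: >-
          Wireless or team interfaces require restarting NetworkManager.
          Set an explicit opt-in variable to allow the restart.
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined
        - not network_restart_allowed | default(false)   # hypothetical opt-in flag, not taken from this log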
49915 1727204318.58722: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 49915 1727204318.58840: in run() - task 028d2410-947f-dcd7-b5af-00000000006c 49915 1727204318.58845: variable 'ansible_search_path' from source: unknown 49915 1727204318.58848: variable 'ansible_search_path' from source: unknown 49915 1727204318.58878: calling self._execute() 49915 1727204318.58983: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204318.59006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204318.59055: variable 'omit' from source: magic vars 49915 1727204318.59429: variable 'ansible_distribution_major_version' from source: facts 49915 1727204318.59451: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204318.59587: variable '__network_wireless_connections_defined' from source: role '' defaults 49915 1727204318.59809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204318.62595: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204318.62704: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204318.62732: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204318.62774: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204318.62820: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204318.62924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204318.63033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204318.63036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.63064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204318.63087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204318.63154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204318.63186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204318.63221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.63277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204318.63297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204318.63358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204318.63580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204318.63584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.63587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204318.63589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204318.63671: variable 'network_connections' from source: task vars 49915 1727204318.63691: variable 'interface' from source: play vars 49915 1727204318.63773: variable 'interface' from source: play vars 49915 1727204318.63793: variable 'vlan_interface' from source: play vars 49915 1727204318.63863: variable 'vlan_interface' from source: play vars 49915 1727204318.63945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204318.64109: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204318.64156: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204318.64190: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204318.64224: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204318.64274: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204318.64302: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204318.64333: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.64369: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 
1727204318.64424: variable '__network_team_connections_defined' from source: role '' defaults 49915 1727204318.64659: variable 'network_connections' from source: task vars 49915 1727204318.64684: variable 'interface' from source: play vars 49915 1727204318.64739: variable 'interface' from source: play vars 49915 1727204318.64793: variable 'vlan_interface' from source: play vars 49915 1727204318.64822: variable 'vlan_interface' from source: play vars 49915 1727204318.64851: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 49915 1727204318.64859: when evaluation is False, skipping this task 49915 1727204318.64866: _execute() done 49915 1727204318.64877: dumping result to json 49915 1727204318.64885: done dumping result, returning 49915 1727204318.64906: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-dcd7-b5af-00000000006c] 49915 1727204318.64983: sending task result for task 028d2410-947f-dcd7-b5af-00000000006c 49915 1727204318.65060: done sending task result for task 028d2410-947f-dcd7-b5af-00000000006c 49915 1727204318.65063: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 49915 1727204318.65136: no more pending results, returning what we have 49915 1727204318.65140: results queue empty 49915 1727204318.65140: checking for any_errors_fatal 49915 1727204318.65147: done checking for any_errors_fatal 49915 1727204318.65147: checking for max_fail_percentage 49915 1727204318.65149: done checking for max_fail_percentage 49915 1727204318.65150: checking to see if all hosts have failed and the running result is not ok 49915 1727204318.65151: done checking to see if all hosts have failed 49915 1727204318.65152: getting the remaining hosts for this loop 49915 1727204318.65154: done getting the remaining hosts for this loop 49915 1727204318.65157: getting the next task for host managed-node2 49915 1727204318.65164: done getting next task for host managed-node2 49915 1727204318.65167: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 49915 1727204318.65170: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204318.65190: getting variables 49915 1727204318.65191: in VariableManager get_vars() 49915 1727204318.65233: Calling all_inventory to load vars for managed-node2 49915 1727204318.65236: Calling groups_inventory to load vars for managed-node2 49915 1727204318.65238: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204318.65248: Calling all_plugins_play to load vars for managed-node2 49915 1727204318.65251: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204318.65253: Calling groups_plugins_play to load vars for managed-node2 49915 1727204318.66495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204318.68089: done with get_vars() 49915 1727204318.68117: done getting variables 49915 1727204318.68180: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:58:38 -0400 (0:00:00.106) 0:00:25.388 ***** 49915 1727204318.68215: entering _queue_task() for managed-node2/package 49915 1727204318.68568: worker is 1 (out of 1 available) 49915 1727204318.68584: exiting _queue_task() for managed-node2/package 49915 1727204318.68597: done queuing things up, now waiting for results queue to drain 49915 1727204318.68598: waiting for pending results... 49915 1727204318.68998: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 49915 1727204318.69012: in run() - task 028d2410-947f-dcd7-b5af-00000000006d 49915 1727204318.69032: variable 'ansible_search_path' from source: unknown 49915 1727204318.69040: variable 'ansible_search_path' from source: unknown 49915 1727204318.69083: calling self._execute() 49915 1727204318.69186: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204318.69202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204318.69217: variable 'omit' from source: magic vars 49915 1727204318.69582: variable 'ansible_distribution_major_version' from source: facts 49915 1727204318.69598: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204318.69889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204318.70046: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204318.70095: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204318.70132: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204318.70205: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204318.70317: variable 'network_packages' from source: role '' defaults 49915 1727204318.70427: variable '__network_provider_setup' from source: role '' defaults 49915 1727204318.70442: variable '__network_service_name_default_nm' from source: role '' defaults 49915 1727204318.70508: variable 
'__network_service_name_default_nm' from source: role '' defaults 49915 1727204318.70522: variable '__network_packages_default_nm' from source: role '' defaults 49915 1727204318.70587: variable '__network_packages_default_nm' from source: role '' defaults 49915 1727204318.70769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204318.72930: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204318.73005: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204318.73184: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204318.73187: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204318.73188: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204318.73190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204318.73207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204318.73234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.73273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204318.73292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204318.73341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204318.73428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204318.73457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.73515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204318.73539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204318.73851: variable '__network_packages_default_gobject_packages' from source: role '' defaults 49915 1727204318.74005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204318.74044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204318.74081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.74129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204318.74166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204318.74279: variable 'ansible_python' from source: facts 49915 1727204318.74311: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 49915 1727204318.74464: variable '__network_wpa_supplicant_required' from source: role '' defaults 49915 1727204318.74551: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 49915 1727204318.74827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204318.74830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204318.74849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.74926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204318.74950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204318.75053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204318.75149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204318.75167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.75370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204318.75373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204318.75473: variable 'network_connections' from source: task vars 49915 1727204318.75478: variable 'interface' from source: play vars 49915 1727204318.75573: variable 'interface' from source: play vars 49915 1727204318.75584: variable 'vlan_interface' from source: play vars 49915 1727204318.75704: variable 'vlan_interface' from source: play vars 49915 1727204318.75746: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204318.75771: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204318.75801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204318.75879: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204318.75882: variable '__network_wireless_connections_defined' from source: role '' defaults 49915 1727204318.76140: variable 'network_connections' from source: task vars 49915 1727204318.76146: variable 'interface' from source: play vars 49915 1727204318.76246: variable 'interface' from source: play vars 49915 1727204318.76250: variable 'vlan_interface' from source: play vars 49915 1727204318.76344: variable 'vlan_interface' from source: play vars 49915 1727204318.76377: variable '__network_packages_default_wireless' from source: role '' defaults 49915 1727204318.76465: variable '__network_wireless_connections_defined' from source: role '' defaults 49915 1727204318.76747: variable 'network_connections' from source: task vars 49915 1727204318.76751: variable 'interface' from source: play vars 49915 1727204318.76881: variable 'interface' from source: play vars 49915 1727204318.76884: variable 'vlan_interface' from source: play vars 49915 1727204318.76887: variable 'vlan_interface' from source: play vars 49915 1727204318.76903: variable '__network_packages_default_team' from source: role '' defaults 49915 1727204318.76977: variable '__network_team_connections_defined' from source: role '' defaults 49915 1727204318.77263: variable 'network_connections' from source: task vars 49915 1727204318.77267: variable 'interface' from source: play vars 49915 1727204318.77339: variable 'interface' from source: play vars 49915 1727204318.77381: variable 'vlan_interface' from source: play vars 49915 1727204318.77496: variable 'vlan_interface' from source: play vars 49915 1727204318.77499: variable '__network_service_name_default_initscripts' from source: role '' defaults 49915 1727204318.77502: variable '__network_service_name_default_initscripts' from source: role '' defaults 49915 1727204318.77504: variable '__network_packages_default_initscripts' from source: role '' defaults 49915 1727204318.77562: variable '__network_packages_default_initscripts' from source: role '' defaults 49915 1727204318.77788: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 49915 1727204318.78240: variable 'network_connections' from source: task vars 49915 
1727204318.78244: variable 'interface' from source: play vars 49915 1727204318.78303: variable 'interface' from source: play vars 49915 1727204318.78311: variable 'vlan_interface' from source: play vars 49915 1727204318.78366: variable 'vlan_interface' from source: play vars 49915 1727204318.78378: variable 'ansible_distribution' from source: facts 49915 1727204318.78381: variable '__network_rh_distros' from source: role '' defaults 49915 1727204318.78383: variable 'ansible_distribution_major_version' from source: facts 49915 1727204318.78483: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 49915 1727204318.78556: variable 'ansible_distribution' from source: facts 49915 1727204318.78560: variable '__network_rh_distros' from source: role '' defaults 49915 1727204318.78565: variable 'ansible_distribution_major_version' from source: facts 49915 1727204318.78579: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 49915 1727204318.78734: variable 'ansible_distribution' from source: facts 49915 1727204318.78738: variable '__network_rh_distros' from source: role '' defaults 49915 1727204318.78743: variable 'ansible_distribution_major_version' from source: facts 49915 1727204318.78779: variable 'network_provider' from source: set_fact 49915 1727204318.78797: variable 'ansible_facts' from source: unknown 49915 1727204318.79576: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 49915 1727204318.79679: when evaluation is False, skipping this task 49915 1727204318.79683: _execute() done 49915 1727204318.79685: dumping result to json 49915 1727204318.79687: done dumping result, returning 49915 1727204318.79690: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-dcd7-b5af-00000000006d] 49915 1727204318.79692: sending task result for task 028d2410-947f-dcd7-b5af-00000000006d 49915 1727204318.79773: done sending task result for task 028d2410-947f-dcd7-b5af-00000000006d 49915 1727204318.79779: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 49915 1727204318.79842: no more pending results, returning what we have 49915 1727204318.79846: results queue empty 49915 1727204318.79847: checking for any_errors_fatal 49915 1727204318.79854: done checking for any_errors_fatal 49915 1727204318.79855: checking for max_fail_percentage 49915 1727204318.79857: done checking for max_fail_percentage 49915 1727204318.79857: checking to see if all hosts have failed and the running result is not ok 49915 1727204318.79859: done checking to see if all hosts have failed 49915 1727204318.79859: getting the remaining hosts for this loop 49915 1727204318.79861: done getting the remaining hosts for this loop 49915 1727204318.79866: getting the next task for host managed-node2 49915 1727204318.79877: done getting next task for host managed-node2 49915 1727204318.79881: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 49915 1727204318.79885: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204318.79911: getting variables 49915 1727204318.79915: in VariableManager get_vars() 49915 1727204318.79964: Calling all_inventory to load vars for managed-node2 49915 1727204318.79967: Calling groups_inventory to load vars for managed-node2 49915 1727204318.79970: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204318.80171: Calling all_plugins_play to load vars for managed-node2 49915 1727204318.80177: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204318.80181: Calling groups_plugins_play to load vars for managed-node2 49915 1727204318.82106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204318.83773: done with get_vars() 49915 1727204318.83799: done getting variables 49915 1727204318.83861: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:58:38 -0400 (0:00:00.156) 0:00:25.545 ***** 49915 1727204318.83895: entering _queue_task() for managed-node2/package 49915 1727204318.84323: worker is 1 (out of 1 available) 49915 1727204318.84337: exiting _queue_task() for managed-node2/package 49915 1727204318.84351: done queuing things up, now waiting for results queue to drain 49915 1727204318.84352: waiting for pending results... 
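The entries above show the "Install packages" task (roles/network/tasks/main.yml:73) being skipped: its conditional, not network_packages is subset(ansible_facts.packages.keys()), came back False because every package the role wants for the selected provider (network_provider is "nm" here, per the earlier set_fact) already appears in the package facts. A minimal sketch of this guard pattern, assuming a tasks-file entry of roughly this shape (the task name and the when expression are taken from the log; the module arguments are illustrative):

- name: Install packages
  package:
    name: "{{ network_packages }}"   # assumed argument; the log confirms only the package action and the condition
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())

The subset test against ansible_facts.packages.keys() (package facts presumably gathered earlier with package_facts) lets the role avoid a package-manager call entirely when nothing new needs installing.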
49915 1727204318.84927: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 49915 1727204318.84932: in run() - task 028d2410-947f-dcd7-b5af-00000000006e 49915 1727204318.84935: variable 'ansible_search_path' from source: unknown 49915 1727204318.84938: variable 'ansible_search_path' from source: unknown 49915 1727204318.84941: calling self._execute() 49915 1727204318.85059: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204318.85073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204318.85091: variable 'omit' from source: magic vars 49915 1727204318.85597: variable 'ansible_distribution_major_version' from source: facts 49915 1727204318.85616: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204318.85751: variable 'network_state' from source: role '' defaults 49915 1727204318.85785: Evaluated conditional (network_state != {}): False 49915 1727204318.85788: when evaluation is False, skipping this task 49915 1727204318.85790: _execute() done 49915 1727204318.85792: dumping result to json 49915 1727204318.85794: done dumping result, returning 49915 1727204318.85882: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-dcd7-b5af-00000000006e] 49915 1727204318.85887: sending task result for task 028d2410-947f-dcd7-b5af-00000000006e 49915 1727204318.85963: done sending task result for task 028d2410-947f-dcd7-b5af-00000000006e 49915 1727204318.85966: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49915 1727204318.86020: no more pending results, returning what we have 49915 1727204318.86024: results queue empty 49915 1727204318.86025: checking for any_errors_fatal 49915 1727204318.86031: done checking for any_errors_fatal 49915 1727204318.86031: checking for max_fail_percentage 49915 1727204318.86034: done checking for max_fail_percentage 49915 1727204318.86034: checking to see if all hosts have failed and the running result is not ok 49915 1727204318.86036: done checking to see if all hosts have failed 49915 1727204318.86036: getting the remaining hosts for this loop 49915 1727204318.86038: done getting the remaining hosts for this loop 49915 1727204318.86042: getting the next task for host managed-node2 49915 1727204318.86049: done getting next task for host managed-node2 49915 1727204318.86052: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 49915 1727204318.86056: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204318.86078: getting variables 49915 1727204318.86080: in VariableManager get_vars() 49915 1727204318.86125: Calling all_inventory to load vars for managed-node2 49915 1727204318.86128: Calling groups_inventory to load vars for managed-node2 49915 1727204318.86130: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204318.86143: Calling all_plugins_play to load vars for managed-node2 49915 1727204318.86146: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204318.86149: Calling groups_plugins_play to load vars for managed-node2 49915 1727204318.87791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204318.89671: done with get_vars() 49915 1727204318.89695: done getting variables 49915 1727204318.89766: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:58:38 -0400 (0:00:00.059) 0:00:25.604 ***** 49915 1727204318.89803: entering _queue_task() for managed-node2/package 49915 1727204318.90382: worker is 1 (out of 1 available) 49915 1727204318.90392: exiting _queue_task() for managed-node2/package 49915 1727204318.90401: done queuing things up, now waiting for results queue to drain 49915 1727204318.90403: waiting for pending results... 
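Both nmstate-related install tasks are gated on the same expression: the task at main.yml:85 above was skipped because network_state != {} evaluated to False (network_state keeps its empty role default, so this run drives configuration through network_connections rather than the nmstate interface), and the entries that follow show the python3-libnmstate task at main.yml:96 skipped for the same reason. A rough sketch of that gate, assuming arguments along these lines (only the task name and the when expression are confirmed by the log):

- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager   # assumed package list, inferred from the task name
      - nmstate
    state: present
  when: network_state != {}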
49915 1727204318.90649: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 49915 1727204318.90695: in run() - task 028d2410-947f-dcd7-b5af-00000000006f 49915 1727204318.90742: variable 'ansible_search_path' from source: unknown 49915 1727204318.90745: variable 'ansible_search_path' from source: unknown 49915 1727204318.90851: calling self._execute() 49915 1727204318.90926: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204318.91023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204318.91068: variable 'omit' from source: magic vars 49915 1727204318.91473: variable 'ansible_distribution_major_version' from source: facts 49915 1727204318.91579: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204318.91644: variable 'network_state' from source: role '' defaults 49915 1727204318.91660: Evaluated conditional (network_state != {}): False 49915 1727204318.91667: when evaluation is False, skipping this task 49915 1727204318.91674: _execute() done 49915 1727204318.91694: dumping result to json 49915 1727204318.91697: done dumping result, returning 49915 1727204318.91718: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-dcd7-b5af-00000000006f] 49915 1727204318.91800: sending task result for task 028d2410-947f-dcd7-b5af-00000000006f skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49915 1727204318.91982: no more pending results, returning what we have 49915 1727204318.91987: results queue empty 49915 1727204318.91988: checking for any_errors_fatal 49915 1727204318.91993: done checking for any_errors_fatal 49915 1727204318.91994: checking for max_fail_percentage 49915 1727204318.91996: done checking for max_fail_percentage 49915 1727204318.91997: checking to see if all hosts have failed and the running result is not ok 49915 1727204318.91998: done checking to see if all hosts have failed 49915 1727204318.91999: getting the remaining hosts for this loop 49915 1727204318.92003: done getting the remaining hosts for this loop 49915 1727204318.92008: getting the next task for host managed-node2 49915 1727204318.92017: done getting next task for host managed-node2 49915 1727204318.92020: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 49915 1727204318.92024: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204318.92044: done sending task result for task 028d2410-947f-dcd7-b5af-00000000006f 49915 1727204318.92047: WORKER PROCESS EXITING 49915 1727204318.92160: getting variables 49915 1727204318.92162: in VariableManager get_vars() 49915 1727204318.92203: Calling all_inventory to load vars for managed-node2 49915 1727204318.92205: Calling groups_inventory to load vars for managed-node2 49915 1727204318.92207: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204318.92221: Calling all_plugins_play to load vars for managed-node2 49915 1727204318.92224: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204318.92227: Calling groups_plugins_play to load vars for managed-node2 49915 1727204318.94083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204318.96287: done with get_vars() 49915 1727204318.96317: done getting variables 49915 1727204318.96386: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:58:38 -0400 (0:00:00.066) 0:00:25.670 ***** 49915 1727204318.96428: entering _queue_task() for managed-node2/service 49915 1727204318.96847: worker is 1 (out of 1 available) 49915 1727204318.96861: exiting _queue_task() for managed-node2/service 49915 1727204318.96873: done queuing things up, now waiting for results queue to drain 49915 1727204318.96874: waiting for pending results... 
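The task just queued (main.yml:109) uses the service action and is meant to fire only when wireless or team profiles appear in the connection list; the entries that follow show __network_wireless_connections_defined or __network_team_connections_defined evaluating to False for this play's network_connections (built from the interface and vlan_interface play vars, neither of which defines a wireless or team profile), so no restart is performed. A hedged sketch of a restart task gated this way (the name, action, and condition come from the log; the service arguments are assumptions):

- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager   # assumed; the log confirms only the service action plugin
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined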
49915 1727204318.97493: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 49915 1727204318.97528: in run() - task 028d2410-947f-dcd7-b5af-000000000070 49915 1727204318.97549: variable 'ansible_search_path' from source: unknown 49915 1727204318.97557: variable 'ansible_search_path' from source: unknown 49915 1727204318.97606: calling self._execute() 49915 1727204318.97817: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204318.97829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204318.97931: variable 'omit' from source: magic vars 49915 1727204318.98463: variable 'ansible_distribution_major_version' from source: facts 49915 1727204318.98483: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204318.98608: variable '__network_wireless_connections_defined' from source: role '' defaults 49915 1727204318.98812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204319.00975: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204319.01049: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204319.01092: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204319.01137: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204319.01169: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204319.01256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204319.01305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204319.01336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204319.01386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204319.01407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204319.01456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204319.01489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204319.01520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 49915 1727204319.01562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204319.01588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204319.01633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204319.01682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204319.01694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204319.01735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204319.01881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204319.01927: variable 'network_connections' from source: task vars 49915 1727204319.01944: variable 'interface' from source: play vars 49915 1727204319.02017: variable 'interface' from source: play vars 49915 1727204319.02033: variable 'vlan_interface' from source: play vars 49915 1727204319.02103: variable 'vlan_interface' from source: play vars 49915 1727204319.02179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204319.02352: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204319.02394: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204319.02436: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204319.02469: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204319.02516: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204319.02547: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204319.02583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204319.02614: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204319.02674: variable '__network_team_connections_defined' from source: role '' 
defaults 49915 1727204319.02979: variable 'network_connections' from source: task vars 49915 1727204319.02984: variable 'interface' from source: play vars 49915 1727204319.03007: variable 'interface' from source: play vars 49915 1727204319.03019: variable 'vlan_interface' from source: play vars 49915 1727204319.03084: variable 'vlan_interface' from source: play vars 49915 1727204319.03117: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 49915 1727204319.03125: when evaluation is False, skipping this task 49915 1727204319.03133: _execute() done 49915 1727204319.03141: dumping result to json 49915 1727204319.03149: done dumping result, returning 49915 1727204319.03160: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-dcd7-b5af-000000000070] 49915 1727204319.03207: sending task result for task 028d2410-947f-dcd7-b5af-000000000070 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 49915 1727204319.03356: no more pending results, returning what we have 49915 1727204319.03359: results queue empty 49915 1727204319.03360: checking for any_errors_fatal 49915 1727204319.03366: done checking for any_errors_fatal 49915 1727204319.03367: checking for max_fail_percentage 49915 1727204319.03369: done checking for max_fail_percentage 49915 1727204319.03370: checking to see if all hosts have failed and the running result is not ok 49915 1727204319.03371: done checking to see if all hosts have failed 49915 1727204319.03372: getting the remaining hosts for this loop 49915 1727204319.03373: done getting the remaining hosts for this loop 49915 1727204319.03379: getting the next task for host managed-node2 49915 1727204319.03386: done getting next task for host managed-node2 49915 1727204319.03391: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 49915 1727204319.03394: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204319.03415: getting variables 49915 1727204319.03417: in VariableManager get_vars() 49915 1727204319.03461: Calling all_inventory to load vars for managed-node2 49915 1727204319.03464: Calling groups_inventory to load vars for managed-node2 49915 1727204319.03466: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204319.03691: Calling all_plugins_play to load vars for managed-node2 49915 1727204319.03696: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204319.03700: Calling groups_plugins_play to load vars for managed-node2 49915 1727204319.04307: done sending task result for task 028d2410-947f-dcd7-b5af-000000000070 49915 1727204319.04311: WORKER PROCESS EXITING 49915 1727204319.05306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204319.08428: done with get_vars() 49915 1727204319.08458: done getting variables 49915 1727204319.08638: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:58:39 -0400 (0:00:00.122) 0:00:25.792 ***** 49915 1727204319.08672: entering _queue_task() for managed-node2/service 49915 1727204319.09194: worker is 1 (out of 1 available) 49915 1727204319.09205: exiting _queue_task() for managed-node2/service 49915 1727204319.09216: done queuing things up, now waiting for results queue to drain 49915 1727204319.09217: waiting for pending results... 
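Unlike the skipped tasks above, "Enable and start NetworkManager" (main.yml:122) actually runs: the entries that follow show network_provider == "nm" or network_state != {} evaluating to True, the ssh connection and sh shell plugins being selected, a per-task temporary directory created on managed-node2 over the existing SSH control master, the cached ANSIBALLZ payload for ansible.modules.systemd uploaded as AnsiballZ_systemd.py via sftp, and the file marked executable. A minimal sketch of a service task driven by the role's network_service_name default (the variable name and the service action are from the log; state and enabled are assumptions):

- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"
    state: started    # assumed arguments; the log shows only that the systemd module is invoked
    enabled: true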
49915 1727204319.09413: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 49915 1727204319.09563: in run() - task 028d2410-947f-dcd7-b5af-000000000071 49915 1727204319.09583: variable 'ansible_search_path' from source: unknown 49915 1727204319.09591: variable 'ansible_search_path' from source: unknown 49915 1727204319.09630: calling self._execute() 49915 1727204319.09732: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204319.09745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204319.09759: variable 'omit' from source: magic vars 49915 1727204319.10133: variable 'ansible_distribution_major_version' from source: facts 49915 1727204319.10150: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204319.10328: variable 'network_provider' from source: set_fact 49915 1727204319.10338: variable 'network_state' from source: role '' defaults 49915 1727204319.10353: Evaluated conditional (network_provider == "nm" or network_state != {}): True 49915 1727204319.10362: variable 'omit' from source: magic vars 49915 1727204319.10431: variable 'omit' from source: magic vars 49915 1727204319.10464: variable 'network_service_name' from source: role '' defaults 49915 1727204319.10536: variable 'network_service_name' from source: role '' defaults 49915 1727204319.10649: variable '__network_provider_setup' from source: role '' defaults 49915 1727204319.10660: variable '__network_service_name_default_nm' from source: role '' defaults 49915 1727204319.10726: variable '__network_service_name_default_nm' from source: role '' defaults 49915 1727204319.10746: variable '__network_packages_default_nm' from source: role '' defaults 49915 1727204319.10814: variable '__network_packages_default_nm' from source: role '' defaults 49915 1727204319.11074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204319.13463: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204319.13785: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204319.13789: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204319.13892: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204319.13896: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204319.14023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204319.14060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204319.14095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204319.14247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 49915 1727204319.14336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204319.14338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204319.14341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204319.14399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204319.14439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204319.14466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204319.14754: variable '__network_packages_default_gobject_packages' from source: role '' defaults 49915 1727204319.14904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204319.14963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204319.15000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204319.15048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204319.15068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204319.15207: variable 'ansible_python' from source: facts 49915 1727204319.15249: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 49915 1727204319.15360: variable '__network_wpa_supplicant_required' from source: role '' defaults 49915 1727204319.15481: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 49915 1727204319.15598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204319.15630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204319.15689: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204319.15769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204319.15772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204319.15825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204319.15886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204319.15983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204319.16001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204319.16021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204319.16242: variable 'network_connections' from source: task vars 49915 1727204319.16256: variable 'interface' from source: play vars 49915 1727204319.16382: variable 'interface' from source: play vars 49915 1727204319.16385: variable 'vlan_interface' from source: play vars 49915 1727204319.16515: variable 'vlan_interface' from source: play vars 49915 1727204319.16709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204319.17192: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204319.17195: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204319.17305: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204319.17359: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204319.17517: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204319.17635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204319.17670: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204319.17755: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204319.17882: variable '__network_wireless_connections_defined' from source: role '' defaults 49915 1727204319.18425: variable 'network_connections' from source: task vars 49915 1727204319.18436: variable 'interface' from source: play vars 49915 1727204319.18531: variable 'interface' from source: play vars 49915 1727204319.18548: variable 'vlan_interface' from source: play vars 49915 1727204319.18629: variable 'vlan_interface' from source: play vars 49915 1727204319.18669: variable '__network_packages_default_wireless' from source: role '' defaults 49915 1727204319.18762: variable '__network_wireless_connections_defined' from source: role '' defaults 49915 1727204319.19072: variable 'network_connections' from source: task vars 49915 1727204319.19085: variable 'interface' from source: play vars 49915 1727204319.19161: variable 'interface' from source: play vars 49915 1727204319.19237: variable 'vlan_interface' from source: play vars 49915 1727204319.19249: variable 'vlan_interface' from source: play vars 49915 1727204319.19277: variable '__network_packages_default_team' from source: role '' defaults 49915 1727204319.19362: variable '__network_team_connections_defined' from source: role '' defaults 49915 1727204319.19660: variable 'network_connections' from source: task vars 49915 1727204319.19679: variable 'interface' from source: play vars 49915 1727204319.19748: variable 'interface' from source: play vars 49915 1727204319.19759: variable 'vlan_interface' from source: play vars 49915 1727204319.19836: variable 'vlan_interface' from source: play vars 49915 1727204319.19900: variable '__network_service_name_default_initscripts' from source: role '' defaults 49915 1727204319.19963: variable '__network_service_name_default_initscripts' from source: role '' defaults 49915 1727204319.19974: variable '__network_packages_default_initscripts' from source: role '' defaults 49915 1727204319.20107: variable '__network_packages_default_initscripts' from source: role '' defaults 49915 1727204319.20270: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 49915 1727204319.20851: variable 'network_connections' from source: task vars 49915 1727204319.20868: variable 'interface' from source: play vars 49915 1727204319.20932: variable 'interface' from source: play vars 49915 1727204319.20946: variable 'vlan_interface' from source: play vars 49915 1727204319.21014: variable 'vlan_interface' from source: play vars 49915 1727204319.21027: variable 'ansible_distribution' from source: facts 49915 1727204319.21035: variable '__network_rh_distros' from source: role '' defaults 49915 1727204319.21045: variable 'ansible_distribution_major_version' from source: facts 49915 1727204319.21063: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 49915 1727204319.21241: variable 'ansible_distribution' from source: facts 49915 1727204319.21285: variable '__network_rh_distros' from source: role '' defaults 49915 1727204319.21289: variable 'ansible_distribution_major_version' from source: facts 49915 1727204319.21291: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 49915 1727204319.21448: variable 'ansible_distribution' from source: facts 49915 1727204319.21455: variable '__network_rh_distros' from source: role '' defaults 49915 1727204319.21463: variable 
'ansible_distribution_major_version' from source: facts 49915 1727204319.21499: variable 'network_provider' from source: set_fact 49915 1727204319.21531: variable 'omit' from source: magic vars 49915 1727204319.21582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204319.21593: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204319.21624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204319.21650: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204319.21738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204319.21742: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204319.21744: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204319.21746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204319.21964: Set connection var ansible_connection to ssh 49915 1727204319.21972: Set connection var ansible_shell_type to sh 49915 1727204319.21992: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204319.22008: Set connection var ansible_shell_executable to /bin/sh 49915 1727204319.22018: Set connection var ansible_timeout to 10 49915 1727204319.22064: Set connection var ansible_pipelining to False 49915 1727204319.22067: variable 'ansible_shell_executable' from source: unknown 49915 1727204319.22069: variable 'ansible_connection' from source: unknown 49915 1727204319.22074: variable 'ansible_module_compression' from source: unknown 49915 1727204319.22086: variable 'ansible_shell_type' from source: unknown 49915 1727204319.22096: variable 'ansible_shell_executable' from source: unknown 49915 1727204319.22111: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204319.22171: variable 'ansible_pipelining' from source: unknown 49915 1727204319.22174: variable 'ansible_timeout' from source: unknown 49915 1727204319.22179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204319.22296: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204319.22299: variable 'omit' from source: magic vars 49915 1727204319.22302: starting attempt loop 49915 1727204319.22304: running the handler 49915 1727204319.22410: variable 'ansible_facts' from source: unknown 49915 1727204319.23260: _low_level_execute_command(): starting 49915 1727204319.23274: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204319.24093: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204319.24108: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204319.24191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204319.24281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204319.24316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204319.24438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204319.26281: stdout chunk (state=3): >>>/root <<< 49915 1727204319.26341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204319.26352: stdout chunk (state=3): >>><<< 49915 1727204319.26364: stderr chunk (state=3): >>><<< 49915 1727204319.26393: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204319.26410: _low_level_execute_command(): starting 49915 1727204319.26424: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204319.2639956-51731-189322220315846 `" && echo ansible-tmp-1727204319.2639956-51731-189322220315846="` echo /root/.ansible/tmp/ansible-tmp-1727204319.2639956-51731-189322220315846 `" ) && sleep 0' 49915 1727204319.27292: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204319.27322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204319.27338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204319.27401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204319.27501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204319.29509: stdout chunk (state=3): >>>ansible-tmp-1727204319.2639956-51731-189322220315846=/root/.ansible/tmp/ansible-tmp-1727204319.2639956-51731-189322220315846 <<< 49915 1727204319.29580: stdout chunk (state=3): >>><<< 49915 1727204319.29642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204319.29653: stderr chunk (state=3): >>><<< 49915 1727204319.29677: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204319.2639956-51731-189322220315846=/root/.ansible/tmp/ansible-tmp-1727204319.2639956-51731-189322220315846 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204319.29719: variable 'ansible_module_compression' from source: unknown 49915 1727204319.29915: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 49915 1727204319.30018: variable 'ansible_facts' from source: unknown 49915 1727204319.30881: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204319.2639956-51731-189322220315846/AnsiballZ_systemd.py 49915 1727204319.31317: Sending initial data 49915 1727204319.31327: Sent initial data (156 bytes) 49915 1727204319.32516: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204319.32591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204319.32716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204319.32733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204319.32758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204319.32907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204319.34607: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 49915 1727204319.34630: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204319.34702: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
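The exchange above stages the module for execution: a temporary directory is created under /root/.ansible/tmp and AnsiballZ_systemd.py is transferred over the already-established SSH connection ("auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305'"). How that connection sharing and the remote temp location are configured is not visible in this run; a minimal sketch using standard inventory variables, with illustrative values only:

    # host_vars/managed-node2.yml -- illustrative values, not taken from this run
    ansible_remote_tmp: ~/.ansible/tmp
    ansible_ssh_common_args: >-
      -o ControlMaster=auto
      -o ControlPersist=60s
      -o ControlPath=~/.ansible/cp/%C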
<<< 49915 1727204319.34837: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp483h7jkh /root/.ansible/tmp/ansible-tmp-1727204319.2639956-51731-189322220315846/AnsiballZ_systemd.py <<< 49915 1727204319.34855: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204319.2639956-51731-189322220315846/AnsiballZ_systemd.py" <<< 49915 1727204319.34871: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp483h7jkh" to remote "/root/.ansible/tmp/ansible-tmp-1727204319.2639956-51731-189322220315846/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204319.2639956-51731-189322220315846/AnsiballZ_systemd.py" <<< 49915 1727204319.36703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204319.36707: stdout chunk (state=3): >>><<< 49915 1727204319.36710: stderr chunk (state=3): >>><<< 49915 1727204319.36714: done transferring module to remote 49915 1727204319.36716: _low_level_execute_command(): starting 49915 1727204319.36718: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204319.2639956-51731-189322220315846/ /root/.ansible/tmp/ansible-tmp-1727204319.2639956-51731-189322220315846/AnsiballZ_systemd.py && sleep 0' 49915 1727204319.37227: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204319.37236: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204319.37247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204319.37261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204319.37274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204319.37284: stderr chunk (state=3): >>>debug2: match not found <<< 49915 1727204319.37295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204319.37310: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49915 1727204319.37388: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204319.37425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204319.37517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204319.39381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204319.39416: stderr chunk (state=3): >>><<< 49915 1727204319.39419: stdout chunk (state=3): >>><<< 49915 1727204319.39484: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204319.39488: _low_level_execute_command(): starting 49915 1727204319.39491: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204319.2639956-51731-189322220315846/AnsiballZ_systemd.py && sleep 0' 49915 1727204319.39916: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204319.40029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204319.40038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204319.40115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204319.69438: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "7081", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": 
"restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainStartTimestampMonotonic": "294798591", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainHandoffTimestampMonotonic": "294813549", "ExecMainPID": "7081", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4312", "MemoryCurrent": "4657152", "MemoryPeak": "7655424", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3303030784", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1896781000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 49915 1727204319.69462: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", 
"LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice 
dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "dbus-broker.service cloud-init-local.service network-pre.target basic.target system.slice systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:56:08 EDT", "StateChangeTimestampMonotonic": "755095855", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveExitTimestampMonotonic": "294799297", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveEnterTimestampMonotonic": "294888092", "ActiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveExitTimestampMonotonic": "294768391", "InactiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveEnterTimestampMonotonic": "294795966", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ConditionTimestampMonotonic": "294797207", "AssertTimestamp": "Tue 2024-09-24 14:48:28 EDT", "AssertTimestampMonotonic": "294797210", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a167241d4c7945a58749ffeda353964d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 49915 1727204319.71527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 49915 1727204319.71531: stderr chunk (state=3): >>><<< 49915 1727204319.71546: stdout chunk (state=3): >>><<< 49915 1727204319.71573: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "7081", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainStartTimestampMonotonic": "294798591", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ExecMainHandoffTimestampMonotonic": "294813549", "ExecMainPID": "7081", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4312", "MemoryCurrent": "4657152", "MemoryPeak": "7655424", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3303030784", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "1896781000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "dbus-broker.service cloud-init-local.service network-pre.target basic.target system.slice systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:56:08 EDT", "StateChangeTimestampMonotonic": "755095855", "InactiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveExitTimestampMonotonic": "294799297", "ActiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveEnterTimestampMonotonic": "294888092", "ActiveExitTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ActiveExitTimestampMonotonic": "294768391", "InactiveEnterTimestamp": "Tue 2024-09-24 14:48:28 EDT", "InactiveEnterTimestampMonotonic": "294795966", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:48:28 EDT", "ConditionTimestampMonotonic": "294797207", "AssertTimestamp": "Tue 2024-09-24 14:48:28 EDT", "AssertTimestampMonotonic": "294797210", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "a167241d4c7945a58749ffeda353964d", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 49915 1727204319.71862: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204319.2639956-51731-189322220315846/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204319.71866: _low_level_execute_command(): starting 49915 1727204319.71869: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204319.2639956-51731-189322220315846/ > /dev/null 2>&1 && sleep 0' 49915 1727204319.72411: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204319.72425: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204319.72441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204319.72488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204319.72555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7e62c1f305' <<< 49915 1727204319.72579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204319.72595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204319.72729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204319.74662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204319.74665: stdout chunk (state=3): >>><<< 49915 1727204319.74668: stderr chunk (state=3): >>><<< 49915 1727204319.74683: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204319.74780: handler run complete 49915 1727204319.74783: attempt loop complete, returning result 49915 1727204319.74786: _execute() done 49915 1727204319.74788: dumping result to json 49915 1727204319.74801: done dumping result, returning 49915 1727204319.74816: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-dcd7-b5af-000000000071] 49915 1727204319.74824: sending task result for task 028d2410-947f-dcd7-b5af-000000000071 49915 1727204319.75186: done sending task result for task 028d2410-947f-dcd7-b5af-000000000071 49915 1727204319.75189: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49915 1727204319.75250: no more pending results, returning what we have 49915 1727204319.75254: results queue empty 49915 1727204319.75255: checking for any_errors_fatal 49915 1727204319.75260: done checking for any_errors_fatal 49915 1727204319.75261: checking for max_fail_percentage 49915 1727204319.75263: done checking for max_fail_percentage 49915 1727204319.75264: checking to see if all hosts have failed and the running result is not ok 49915 1727204319.75265: done checking to see if all hosts have failed 49915 1727204319.75266: getting the remaining hosts for this loop 49915 1727204319.75268: done getting the remaining hosts for this loop 49915 1727204319.75272: getting the next task for host managed-node2 49915 1727204319.75281: done getting next task for host managed-node2 49915 1727204319.75285: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 49915 1727204319.75288: ^ state 
is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204319.75478: getting variables 49915 1727204319.75481: in VariableManager get_vars() 49915 1727204319.75552: Calling all_inventory to load vars for managed-node2 49915 1727204319.75555: Calling groups_inventory to load vars for managed-node2 49915 1727204319.75557: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204319.75567: Calling all_plugins_play to load vars for managed-node2 49915 1727204319.75570: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204319.75573: Calling groups_plugins_play to load vars for managed-node2 49915 1727204319.77121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204319.78758: done with get_vars() 49915 1727204319.78784: done getting variables 49915 1727204319.78850: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:58:39 -0400 (0:00:00.702) 0:00:26.495 ***** 49915 1727204319.78884: entering _queue_task() for managed-node2/service 49915 1727204319.79219: worker is 1 (out of 1 available) 49915 1727204319.79230: exiting _queue_task() for managed-node2/service 49915 1727204319.79242: done queuing things up, now waiting for results queue to drain 49915 1727204319.79244: waiting for pending results... 
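The ok: result above prints only a "censored" placeholder because this step runs with no_log: true ('_ansible_no_log': True in the module call), which suppresses the full return value in callback output while the module itself still runs and returns the property dump shown earlier. The pattern, as a minimal hypothetical task (not the role's actual source):

    - name: Enable and start NetworkManager       # hypothetical task for illustration
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true                                # callback shows the "censored" placeholder instead of the result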
49915 1727204319.79692: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 49915 1727204319.79698: in run() - task 028d2410-947f-dcd7-b5af-000000000072 49915 1727204319.79702: variable 'ansible_search_path' from source: unknown 49915 1727204319.79705: variable 'ansible_search_path' from source: unknown 49915 1727204319.79824: calling self._execute() 49915 1727204319.79836: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204319.79847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204319.79859: variable 'omit' from source: magic vars 49915 1727204319.80264: variable 'ansible_distribution_major_version' from source: facts 49915 1727204319.80283: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204319.80409: variable 'network_provider' from source: set_fact 49915 1727204319.80424: Evaluated conditional (network_provider == "nm"): True 49915 1727204319.80528: variable '__network_wpa_supplicant_required' from source: role '' defaults 49915 1727204319.80626: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 49915 1727204319.80911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204319.89283: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204319.89362: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204319.89406: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204319.89452: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204319.89484: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204319.89578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204319.89610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204319.89642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204319.89697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204319.89719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204319.89777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204319.89808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 49915 1727204319.89840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204319.89893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204319.89982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204319.89985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204319.89988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204319.90019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204319.90062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204319.90081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204319.90240: variable 'network_connections' from source: task vars 49915 1727204319.90256: variable 'interface' from source: play vars 49915 1727204319.90341: variable 'interface' from source: play vars 49915 1727204319.90356: variable 'vlan_interface' from source: play vars 49915 1727204319.90431: variable 'vlan_interface' from source: play vars 49915 1727204319.90538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204319.90685: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204319.90728: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204319.90768: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204319.90802: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204319.90867: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 49915 1727204319.90885: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 49915 1727204319.90971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204319.90974: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 49915 1727204319.90989: variable '__network_wireless_connections_defined' from source: role '' defaults 49915 1727204319.91251: variable 'network_connections' from source: task vars 49915 1727204319.91262: variable 'interface' from source: play vars 49915 1727204319.91335: variable 'interface' from source: play vars 49915 1727204319.91347: variable 'vlan_interface' from source: play vars 49915 1727204319.91577: variable 'vlan_interface' from source: play vars 49915 1727204319.91583: Evaluated conditional (__network_wpa_supplicant_required): False 49915 1727204319.91586: when evaluation is False, skipping this task 49915 1727204319.91595: _execute() done 49915 1727204319.91598: dumping result to json 49915 1727204319.91600: done dumping result, returning 49915 1727204319.91602: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-dcd7-b5af-000000000072] 49915 1727204319.91604: sending task result for task 028d2410-947f-dcd7-b5af-000000000072 49915 1727204319.91670: done sending task result for task 028d2410-947f-dcd7-b5af-000000000072 49915 1727204319.91673: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 49915 1727204319.91721: no more pending results, returning what we have 49915 1727204319.91724: results queue empty 49915 1727204319.91724: checking for any_errors_fatal 49915 1727204319.91741: done checking for any_errors_fatal 49915 1727204319.91742: checking for max_fail_percentage 49915 1727204319.91743: done checking for max_fail_percentage 49915 1727204319.91744: checking to see if all hosts have failed and the running result is not ok 49915 1727204319.91745: done checking to see if all hosts have failed 49915 1727204319.91746: getting the remaining hosts for this loop 49915 1727204319.91748: done getting the remaining hosts for this loop 49915 1727204319.91752: getting the next task for host managed-node2 49915 1727204319.91759: done getting next task for host managed-node2 49915 1727204319.91763: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 49915 1727204319.91766: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204319.91790: getting variables 49915 1727204319.91793: in VariableManager get_vars() 49915 1727204319.91838: Calling all_inventory to load vars for managed-node2 49915 1727204319.91841: Calling groups_inventory to load vars for managed-node2 49915 1727204319.91844: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204319.91854: Calling all_plugins_play to load vars for managed-node2 49915 1727204319.91856: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204319.91860: Calling groups_plugins_play to load vars for managed-node2 49915 1727204319.98415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204319.99952: done with get_vars() 49915 1727204319.99979: done getting variables 49915 1727204320.00029: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:58:39 -0400 (0:00:00.211) 0:00:26.706 ***** 49915 1727204320.00055: entering _queue_task() for managed-node2/service 49915 1727204320.00407: worker is 1 (out of 1 available) 49915 1727204320.00425: exiting _queue_task() for managed-node2/service 49915 1727204320.00437: done queuing things up, now waiting for results queue to drain 49915 1727204320.00438: waiting for pending results... 49915 1727204320.00772: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 49915 1727204320.00859: in run() - task 028d2410-947f-dcd7-b5af-000000000073 49915 1727204320.00891: variable 'ansible_search_path' from source: unknown 49915 1727204320.00899: variable 'ansible_search_path' from source: unknown 49915 1727204320.00979: calling self._execute() 49915 1727204320.01052: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204320.01065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204320.01086: variable 'omit' from source: magic vars 49915 1727204320.01495: variable 'ansible_distribution_major_version' from source: facts 49915 1727204320.01580: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204320.01650: variable 'network_provider' from source: set_fact 49915 1727204320.01663: Evaluated conditional (network_provider == "initscripts"): False 49915 1727204320.01671: when evaluation is False, skipping this task 49915 1727204320.01682: _execute() done 49915 1727204320.01690: dumping result to json 49915 1727204320.01700: done dumping result, returning 49915 1727204320.01710: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-dcd7-b5af-000000000073] 49915 1727204320.01725: sending task result for task 028d2410-947f-dcd7-b5af-000000000073 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 49915 1727204320.01882: no more pending results, returning what we have 49915 1727204320.01886: results queue empty 49915 1727204320.01887: checking for 
any_errors_fatal 49915 1727204320.01897: done checking for any_errors_fatal 49915 1727204320.01897: checking for max_fail_percentage 49915 1727204320.01900: done checking for max_fail_percentage 49915 1727204320.01900: checking to see if all hosts have failed and the running result is not ok 49915 1727204320.01902: done checking to see if all hosts have failed 49915 1727204320.01902: getting the remaining hosts for this loop 49915 1727204320.01904: done getting the remaining hosts for this loop 49915 1727204320.01909: getting the next task for host managed-node2 49915 1727204320.01920: done getting next task for host managed-node2 49915 1727204320.01924: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 49915 1727204320.01927: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204320.01948: getting variables 49915 1727204320.01950: in VariableManager get_vars() 49915 1727204320.01996: Calling all_inventory to load vars for managed-node2 49915 1727204320.01999: Calling groups_inventory to load vars for managed-node2 49915 1727204320.02004: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204320.02018: Calling all_plugins_play to load vars for managed-node2 49915 1727204320.02021: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204320.02024: Calling groups_plugins_play to load vars for managed-node2 49915 1727204320.02708: done sending task result for task 028d2410-947f-dcd7-b5af-000000000073 49915 1727204320.02714: WORKER PROCESS EXITING 49915 1727204320.03664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204320.05346: done with get_vars() 49915 1727204320.05367: done getting variables 49915 1727204320.05433: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:58:40 -0400 (0:00:00.054) 0:00:26.760 ***** 49915 1727204320.05467: entering _queue_task() for managed-node2/copy 49915 1727204320.05820: worker is 1 (out of 1 available) 49915 1727204320.05833: exiting _queue_task() for managed-node2/copy 49915 1727204320.05844: done queuing things up, now waiting for results queue to drain 49915 1727204320.05846: waiting for pending results... 
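Both skips above come from when: conditions on the role's tasks: __network_wpa_supplicant_required evaluated to False for the wpa_supplicant step, and network_provider == "initscripts" evaluated to False for the network service step. A minimal sketch of the pattern (the task bodies are placeholders, not the role's actual source; only the conditions are taken from the log):

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when: __network_wpa_supplicant_required | bool

    - name: Enable network service
      ansible.builtin.service:
        name: network
        state: started
        enabled: true
      when: network_provider == "initscripts"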
49915 1727204320.06124: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 49915 1727204320.06272: in run() - task 028d2410-947f-dcd7-b5af-000000000074 49915 1727204320.06304: variable 'ansible_search_path' from source: unknown 49915 1727204320.06317: variable 'ansible_search_path' from source: unknown 49915 1727204320.06357: calling self._execute() 49915 1727204320.06456: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204320.06466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204320.06478: variable 'omit' from source: magic vars 49915 1727204320.06958: variable 'ansible_distribution_major_version' from source: facts 49915 1727204320.06963: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204320.07067: variable 'network_provider' from source: set_fact 49915 1727204320.07081: Evaluated conditional (network_provider == "initscripts"): False 49915 1727204320.07089: when evaluation is False, skipping this task 49915 1727204320.07096: _execute() done 49915 1727204320.07105: dumping result to json 49915 1727204320.07117: done dumping result, returning 49915 1727204320.07129: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-dcd7-b5af-000000000074] 49915 1727204320.07177: sending task result for task 028d2410-947f-dcd7-b5af-000000000074 skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 49915 1727204320.07331: no more pending results, returning what we have 49915 1727204320.07335: results queue empty 49915 1727204320.07336: checking for any_errors_fatal 49915 1727204320.07342: done checking for any_errors_fatal 49915 1727204320.07343: checking for max_fail_percentage 49915 1727204320.07345: done checking for max_fail_percentage 49915 1727204320.07346: checking to see if all hosts have failed and the running result is not ok 49915 1727204320.07347: done checking to see if all hosts have failed 49915 1727204320.07348: getting the remaining hosts for this loop 49915 1727204320.07349: done getting the remaining hosts for this loop 49915 1727204320.07353: getting the next task for host managed-node2 49915 1727204320.07361: done getting next task for host managed-node2 49915 1727204320.07365: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 49915 1727204320.07369: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204320.07395: getting variables 49915 1727204320.07397: in VariableManager get_vars() 49915 1727204320.07445: Calling all_inventory to load vars for managed-node2 49915 1727204320.07448: Calling groups_inventory to load vars for managed-node2 49915 1727204320.07451: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204320.07465: Calling all_plugins_play to load vars for managed-node2 49915 1727204320.07468: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204320.07471: Calling groups_plugins_play to load vars for managed-node2 49915 1727204320.08326: done sending task result for task 028d2410-947f-dcd7-b5af-000000000074 49915 1727204320.08329: WORKER PROCESS EXITING 49915 1727204320.09226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204320.10966: done with get_vars() 49915 1727204320.10995: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:58:40 -0400 (0:00:00.056) 0:00:26.817 ***** 49915 1727204320.11100: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 49915 1727204320.11596: worker is 1 (out of 1 available) 49915 1727204320.11608: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 49915 1727204320.11620: done queuing things up, now waiting for results queue to drain 49915 1727204320.11621: waiting for pending results... 49915 1727204320.11822: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 49915 1727204320.11981: in run() - task 028d2410-947f-dcd7-b5af-000000000075 49915 1727204320.12006: variable 'ansible_search_path' from source: unknown 49915 1727204320.12017: variable 'ansible_search_path' from source: unknown 49915 1727204320.12065: calling self._execute() 49915 1727204320.12167: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204320.12183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204320.12197: variable 'omit' from source: magic vars 49915 1727204320.12609: variable 'ansible_distribution_major_version' from source: facts 49915 1727204320.12632: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204320.12644: variable 'omit' from source: magic vars 49915 1727204320.12711: variable 'omit' from source: magic vars 49915 1727204320.12883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204320.15068: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204320.15106: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204320.15153: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204320.15200: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204320.15234: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204320.15336: variable 'network_provider' from source: set_fact 49915 1727204320.15495: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204320.15543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204320.15579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204320.15681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204320.15685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204320.15758: variable 'omit' from source: magic vars 49915 1727204320.15889: variable 'omit' from source: magic vars 49915 1727204320.16003: variable 'network_connections' from source: task vars 49915 1727204320.16023: variable 'interface' from source: play vars 49915 1727204320.16096: variable 'interface' from source: play vars 49915 1727204320.16109: variable 'vlan_interface' from source: play vars 49915 1727204320.16263: variable 'vlan_interface' from source: play vars 49915 1727204320.16347: variable 'omit' from source: magic vars 49915 1727204320.16360: variable '__lsr_ansible_managed' from source: task vars 49915 1727204320.16435: variable '__lsr_ansible_managed' from source: task vars 49915 1727204320.16742: Loaded config def from plugin (lookup/template) 49915 1727204320.16752: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 49915 1727204320.16785: File lookup term: get_ansible_managed.j2 49915 1727204320.16793: variable 'ansible_search_path' from source: unknown 49915 1727204320.16803: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 49915 1727204320.16878: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 49915 1727204320.16883: variable 'ansible_search_path' from source: unknown 49915 1727204320.23284: variable 'ansible_managed' from source: 
unknown 49915 1727204320.23431: variable 'omit' from source: magic vars 49915 1727204320.23463: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204320.23497: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204320.23523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204320.23551: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204320.23566: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204320.23647: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204320.23650: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204320.23652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204320.23722: Set connection var ansible_connection to ssh 49915 1727204320.23730: Set connection var ansible_shell_type to sh 49915 1727204320.23742: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204320.23763: Set connection var ansible_shell_executable to /bin/sh 49915 1727204320.23773: Set connection var ansible_timeout to 10 49915 1727204320.23787: Set connection var ansible_pipelining to False 49915 1727204320.23817: variable 'ansible_shell_executable' from source: unknown 49915 1727204320.23863: variable 'ansible_connection' from source: unknown 49915 1727204320.23866: variable 'ansible_module_compression' from source: unknown 49915 1727204320.23868: variable 'ansible_shell_type' from source: unknown 49915 1727204320.23870: variable 'ansible_shell_executable' from source: unknown 49915 1727204320.23872: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204320.23873: variable 'ansible_pipelining' from source: unknown 49915 1727204320.23875: variable 'ansible_timeout' from source: unknown 49915 1727204320.23878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204320.23987: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49915 1727204320.24084: variable 'omit' from source: magic vars 49915 1727204320.24088: starting attempt loop 49915 1727204320.24091: running the handler 49915 1727204320.24093: _low_level_execute_command(): starting 49915 1727204320.24096: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204320.24820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204320.24835: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204320.24859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204320.24970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204320.24989: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204320.25016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204320.25125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204320.27030: stdout chunk (state=3): >>>/root <<< 49915 1727204320.27154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204320.27162: stdout chunk (state=3): >>><<< 49915 1727204320.27170: stderr chunk (state=3): >>><<< 49915 1727204320.27190: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204320.27203: _low_level_execute_command(): starting 49915 1727204320.27209: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204320.2719538-51771-20883064663697 `" && echo ansible-tmp-1727204320.2719538-51771-20883064663697="` echo /root/.ansible/tmp/ansible-tmp-1727204320.2719538-51771-20883064663697 `" ) && sleep 0' 49915 1727204320.27617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204320.27630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204320.27642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204320.27687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204320.27701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204320.27777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204320.29689: stdout chunk (state=3): >>>ansible-tmp-1727204320.2719538-51771-20883064663697=/root/.ansible/tmp/ansible-tmp-1727204320.2719538-51771-20883064663697 <<< 49915 1727204320.29797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204320.29822: stderr chunk (state=3): >>><<< 49915 1727204320.29826: stdout chunk (state=3): >>><<< 49915 1727204320.29841: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204320.2719538-51771-20883064663697=/root/.ansible/tmp/ansible-tmp-1727204320.2719538-51771-20883064663697 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204320.29879: variable 'ansible_module_compression' from source: unknown 49915 1727204320.29920: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 49915 1727204320.29944: variable 'ansible_facts' from source: unknown 49915 1727204320.30009: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204320.2719538-51771-20883064663697/AnsiballZ_network_connections.py 49915 1727204320.30105: Sending initial data 49915 1727204320.30109: Sent initial data (167 bytes) 49915 1727204320.30536: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204320.30539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204320.30546: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204320.30550: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204320.30552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204320.30554: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204320.30597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204320.30600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204320.30680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204320.32314: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204320.32412: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49915 1727204320.32492: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpesa7y18p /root/.ansible/tmp/ansible-tmp-1727204320.2719538-51771-20883064663697/AnsiballZ_network_connections.py <<< 49915 1727204320.32496: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204320.2719538-51771-20883064663697/AnsiballZ_network_connections.py" <<< 49915 1727204320.32566: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpesa7y18p" to remote "/root/.ansible/tmp/ansible-tmp-1727204320.2719538-51771-20883064663697/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204320.2719538-51771-20883064663697/AnsiballZ_network_connections.py" <<< 49915 1727204320.33683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204320.33686: stderr chunk (state=3): >>><<< 49915 1727204320.33689: stdout chunk (state=3): >>><<< 49915 1727204320.33690: done transferring module to remote 49915 1727204320.33692: _low_level_execute_command(): starting 49915 1727204320.33694: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204320.2719538-51771-20883064663697/ /root/.ansible/tmp/ansible-tmp-1727204320.2719538-51771-20883064663697/AnsiballZ_network_connections.py && sleep 0' 49915 1727204320.34109: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204320.34184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204320.34195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204320.34222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204320.34326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204320.36387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204320.36391: stdout chunk (state=3): >>><<< 49915 1727204320.36394: stderr chunk (state=3): >>><<< 49915 1727204320.36563: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204320.36567: _low_level_execute_command(): starting 49915 1727204320.36569: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204320.2719538-51771-20883064663697/AnsiballZ_network_connections.py && sleep 0' 49915 1727204320.38141: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204320.38147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204320.38301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204320.78311: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ui8lb6fx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ui8lb6fx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101/2409d17b-b636-4a3f-a5ef-a537c98e999e: error=unknown <<< 49915 1727204320.79862: stdout chunk (state=3): >>>Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ui8lb6fx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ui8lb6fx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101.90/6c870b68-6dd2-4763-9564-574ea4efb444: error=unknown <<< 49915 1727204320.80086: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 49915 1727204320.81960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 49915 1727204320.82071: stderr chunk (state=3): >>><<< 49915 1727204320.82079: stdout chunk (state=3): >>><<< 49915 1727204320.82082: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ui8lb6fx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ui8lb6fx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101/2409d17b-b636-4a3f-a5ef-a537c98e999e: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ui8lb6fx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ui8lb6fx/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101.90/6c870b68-6dd2-4763-9564-574ea4efb444: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": 
"nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
49915 1727204320.82085: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr101', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'lsr101.90', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204320.2719538-51771-20883064663697/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204320.82087: _low_level_execute_command(): starting 49915 1727204320.82089: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204320.2719538-51771-20883064663697/ > /dev/null 2>&1 && sleep 0' 49915 1727204320.82636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204320.82713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204320.82718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204320.82751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204320.82762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204320.82835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204320.82850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204320.82934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204320.84868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204320.84894: stderr chunk (state=3): >>><<< 49915 1727204320.84897: stdout chunk (state=3): >>><<< 49915 1727204320.84911: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204320.84916: handler run complete 49915 1727204320.84937: attempt loop complete, returning result 49915 1727204320.84940: _execute() done 49915 1727204320.84942: dumping result to json 49915 1727204320.84947: done dumping result, returning 49915 1727204320.84958: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-dcd7-b5af-000000000075] 49915 1727204320.84960: sending task result for task 028d2410-947f-dcd7-b5af-000000000075 49915 1727204320.85089: done sending task result for task 028d2410-947f-dcd7-b5af-000000000075 49915 1727204320.85092: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr101", "persistent_state": "absent", "state": "down" }, { "name": "lsr101.90", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 49915 1727204320.85192: no more pending results, returning what we have 49915 1727204320.85195: results queue empty 49915 1727204320.85196: checking for any_errors_fatal 49915 1727204320.85201: done checking for any_errors_fatal 49915 1727204320.85202: checking for max_fail_percentage 49915 1727204320.85203: done checking for max_fail_percentage 49915 1727204320.85204: checking to see if all hosts have failed and the running result is not ok 49915 1727204320.85205: done checking to see if all hosts have failed 49915 1727204320.85206: getting the remaining hosts for this loop 49915 1727204320.85207: done getting the remaining hosts for this loop 49915 1727204320.85210: getting the next task for host managed-node2 49915 1727204320.85218: done getting next task for host managed-node2 49915 1727204320.85222: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 49915 1727204320.85224: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204320.85234: getting variables 49915 1727204320.85236: in VariableManager get_vars() 49915 1727204320.85389: Calling all_inventory to load vars for managed-node2 49915 1727204320.85393: Calling groups_inventory to load vars for managed-node2 49915 1727204320.85396: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204320.85405: Calling all_plugins_play to load vars for managed-node2 49915 1727204320.85407: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204320.85409: Calling groups_plugins_play to load vars for managed-node2 49915 1727204320.86760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204320.87741: done with get_vars() 49915 1727204320.87757: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:58:40 -0400 (0:00:00.767) 0:00:27.584 ***** 49915 1727204320.87823: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 49915 1727204320.88068: worker is 1 (out of 1 available) 49915 1727204320.88084: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 49915 1727204320.88095: done queuing things up, now waiting for results queue to drain 49915 1727204320.88097: waiting for pending results... 49915 1727204320.88274: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 49915 1727204320.88362: in run() - task 028d2410-947f-dcd7-b5af-000000000076 49915 1727204320.88379: variable 'ansible_search_path' from source: unknown 49915 1727204320.88382: variable 'ansible_search_path' from source: unknown 49915 1727204320.88412: calling self._execute() 49915 1727204320.88514: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204320.88522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204320.88530: variable 'omit' from source: magic vars 49915 1727204320.89087: variable 'ansible_distribution_major_version' from source: facts 49915 1727204320.89091: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204320.89093: variable 'network_state' from source: role '' defaults 49915 1727204320.89095: Evaluated conditional (network_state != {}): False 49915 1727204320.89098: when evaluation is False, skipping this task 49915 1727204320.89100: _execute() done 49915 1727204320.89102: dumping result to json 49915 1727204320.89104: done dumping result, returning 49915 1727204320.89108: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-dcd7-b5af-000000000076] 49915 1727204320.89110: sending task result for task 028d2410-947f-dcd7-b5af-000000000076 49915 1727204320.89168: done sending task result for task 028d2410-947f-dcd7-b5af-000000000076 49915 1727204320.89170: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 49915 1727204320.89235: no more pending results, returning what we have 49915 1727204320.89242: results queue empty 49915 1727204320.89243: checking for any_errors_fatal 49915 1727204320.89251: done checking for any_errors_fatal 49915 1727204320.89252: checking for max_fail_percentage 49915 
1727204320.89253: done checking for max_fail_percentage 49915 1727204320.89254: checking to see if all hosts have failed and the running result is not ok 49915 1727204320.89255: done checking to see if all hosts have failed 49915 1727204320.89255: getting the remaining hosts for this loop 49915 1727204320.89257: done getting the remaining hosts for this loop 49915 1727204320.89260: getting the next task for host managed-node2 49915 1727204320.89267: done getting next task for host managed-node2 49915 1727204320.89270: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 49915 1727204320.89273: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204320.89292: getting variables 49915 1727204320.89293: in VariableManager get_vars() 49915 1727204320.89328: Calling all_inventory to load vars for managed-node2 49915 1727204320.89330: Calling groups_inventory to load vars for managed-node2 49915 1727204320.89333: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204320.89341: Calling all_plugins_play to load vars for managed-node2 49915 1727204320.89343: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204320.89345: Calling groups_plugins_play to load vars for managed-node2 49915 1727204320.90536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204320.91566: done with get_vars() 49915 1727204320.91584: done getting variables 49915 1727204320.91629: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:58:40 -0400 (0:00:00.038) 0:00:27.622 ***** 49915 1727204320.91653: entering _queue_task() for managed-node2/debug 49915 1727204320.91891: worker is 1 (out of 1 available) 49915 1727204320.91906: exiting _queue_task() for managed-node2/debug 49915 1727204320.91919: done queuing things up, now waiting for results queue to drain 49915 1727204320.91920: waiting for pending results... 
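
The changed result a few entries above (task "Configure networking connection profiles", main.yml:159) echoes the module_args this run passed to the nm provider: take down and remove the profiles lsr101 and lsr101.90. Expressed as role input, that request corresponds roughly to the sketch below, with values taken from the echoed module_args and the network_provider/network_connections variables resolved earlier in this log; how the test playbook actually sets these variables (play vars, set_fact, task vars) is not shown here and may differ:

# Sketch only; values inferred from the module_args echoed in this log.
- hosts: managed-node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm
        network_connections:
          - name: lsr101            # base interface profile ("interface" play var above)
            persistent_state: absent
            state: down
          - name: lsr101.90         # presumably the VLAN profile ("vlan_interface" play var above)
            persistent_state: absent
            state: down

Note that the module's stdout above contains two LsrNetworkNmError tracebacks ("Connection volatilize aborted ... error=unknown"), yet the task result reports changed with failed=false and an empty stderr, so this run carries on and treats those callbacks as non-fatal.
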
49915 1727204320.92098: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 49915 1727204320.92186: in run() - task 028d2410-947f-dcd7-b5af-000000000077 49915 1727204320.92199: variable 'ansible_search_path' from source: unknown 49915 1727204320.92203: variable 'ansible_search_path' from source: unknown 49915 1727204320.92232: calling self._execute() 49915 1727204320.92304: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204320.92309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204320.92321: variable 'omit' from source: magic vars 49915 1727204320.92596: variable 'ansible_distribution_major_version' from source: facts 49915 1727204320.92606: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204320.92611: variable 'omit' from source: magic vars 49915 1727204320.92649: variable 'omit' from source: magic vars 49915 1727204320.92673: variable 'omit' from source: magic vars 49915 1727204320.92712: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204320.92741: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204320.92757: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204320.92770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204320.92782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204320.92835: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204320.92838: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204320.92840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204320.93022: Set connection var ansible_connection to ssh 49915 1727204320.93026: Set connection var ansible_shell_type to sh 49915 1727204320.93028: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204320.93031: Set connection var ansible_shell_executable to /bin/sh 49915 1727204320.93033: Set connection var ansible_timeout to 10 49915 1727204320.93035: Set connection var ansible_pipelining to False 49915 1727204320.93037: variable 'ansible_shell_executable' from source: unknown 49915 1727204320.93039: variable 'ansible_connection' from source: unknown 49915 1727204320.93041: variable 'ansible_module_compression' from source: unknown 49915 1727204320.93043: variable 'ansible_shell_type' from source: unknown 49915 1727204320.93045: variable 'ansible_shell_executable' from source: unknown 49915 1727204320.93047: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204320.93049: variable 'ansible_pipelining' from source: unknown 49915 1727204320.93051: variable 'ansible_timeout' from source: unknown 49915 1727204320.93053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204320.93203: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 
1727204320.93220: variable 'omit' from source: magic vars 49915 1727204320.93230: starting attempt loop 49915 1727204320.93235: running the handler 49915 1727204320.93365: variable '__network_connections_result' from source: set_fact 49915 1727204320.93428: handler run complete 49915 1727204320.93451: attempt loop complete, returning result 49915 1727204320.93491: _execute() done 49915 1727204320.93495: dumping result to json 49915 1727204320.93498: done dumping result, returning 49915 1727204320.93500: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-dcd7-b5af-000000000077] 49915 1727204320.93502: sending task result for task 028d2410-947f-dcd7-b5af-000000000077 49915 1727204320.93679: done sending task result for task 028d2410-947f-dcd7-b5af-000000000077 49915 1727204320.93683: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 49915 1727204320.93750: no more pending results, returning what we have 49915 1727204320.93754: results queue empty 49915 1727204320.93755: checking for any_errors_fatal 49915 1727204320.93762: done checking for any_errors_fatal 49915 1727204320.93763: checking for max_fail_percentage 49915 1727204320.93765: done checking for max_fail_percentage 49915 1727204320.93765: checking to see if all hosts have failed and the running result is not ok 49915 1727204320.93766: done checking to see if all hosts have failed 49915 1727204320.93767: getting the remaining hosts for this loop 49915 1727204320.93769: done getting the remaining hosts for this loop 49915 1727204320.93773: getting the next task for host managed-node2 49915 1727204320.93783: done getting next task for host managed-node2 49915 1727204320.93787: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 49915 1727204320.93791: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204320.93843: getting variables 49915 1727204320.93845: in VariableManager get_vars() 49915 1727204320.93892: Calling all_inventory to load vars for managed-node2 49915 1727204320.93894: Calling groups_inventory to load vars for managed-node2 49915 1727204320.93897: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204320.93937: Calling all_plugins_play to load vars for managed-node2 49915 1727204320.93940: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204320.93943: Calling groups_plugins_play to load vars for managed-node2 49915 1727204320.94855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204320.95703: done with get_vars() 49915 1727204320.95718: done getting variables 49915 1727204320.95760: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:58:40 -0400 (0:00:00.041) 0:00:27.664 ***** 49915 1727204320.95785: entering _queue_task() for managed-node2/debug 49915 1727204320.96015: worker is 1 (out of 1 available) 49915 1727204320.96029: exiting _queue_task() for managed-node2/debug 49915 1727204320.96040: done queuing things up, now waiting for results queue to drain 49915 1727204320.96041: waiting for pending results... 49915 1727204320.96221: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 49915 1727204320.96311: in run() - task 028d2410-947f-dcd7-b5af-000000000078 49915 1727204320.96324: variable 'ansible_search_path' from source: unknown 49915 1727204320.96328: variable 'ansible_search_path' from source: unknown 49915 1727204320.96356: calling self._execute() 49915 1727204320.96431: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204320.96435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204320.96442: variable 'omit' from source: magic vars 49915 1727204320.96715: variable 'ansible_distribution_major_version' from source: facts 49915 1727204320.96728: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204320.96733: variable 'omit' from source: magic vars 49915 1727204320.96771: variable 'omit' from source: magic vars 49915 1727204320.96797: variable 'omit' from source: magic vars 49915 1727204320.96833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204320.96861: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204320.96877: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204320.96891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204320.96900: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204320.96929: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204320.96933: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204320.96935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204320.96999: Set connection var ansible_connection to ssh 49915 1727204320.97002: Set connection var ansible_shell_type to sh 49915 1727204320.97010: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204320.97020: Set connection var ansible_shell_executable to /bin/sh 49915 1727204320.97023: Set connection var ansible_timeout to 10 49915 1727204320.97036: Set connection var ansible_pipelining to False 49915 1727204320.97049: variable 'ansible_shell_executable' from source: unknown 49915 1727204320.97052: variable 'ansible_connection' from source: unknown 49915 1727204320.97055: variable 'ansible_module_compression' from source: unknown 49915 1727204320.97058: variable 'ansible_shell_type' from source: unknown 49915 1727204320.97060: variable 'ansible_shell_executable' from source: unknown 49915 1727204320.97062: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204320.97066: variable 'ansible_pipelining' from source: unknown 49915 1727204320.97068: variable 'ansible_timeout' from source: unknown 49915 1727204320.97073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204320.97179: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204320.97187: variable 'omit' from source: magic vars 49915 1727204320.97192: starting attempt loop 49915 1727204320.97195: running the handler 49915 1727204320.97236: variable '__network_connections_result' from source: set_fact 49915 1727204320.97293: variable '__network_connections_result' from source: set_fact 49915 1727204320.97373: handler run complete 49915 1727204320.97392: attempt loop complete, returning result 49915 1727204320.97395: _execute() done 49915 1727204320.97398: dumping result to json 49915 1727204320.97400: done dumping result, returning 49915 1727204320.97408: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-dcd7-b5af-000000000078] 49915 1727204320.97412: sending task result for task 028d2410-947f-dcd7-b5af-000000000078 49915 1727204320.97499: done sending task result for task 028d2410-947f-dcd7-b5af-000000000078 49915 1727204320.97501: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr101", "persistent_state": "absent", "state": "down" }, { "name": "lsr101.90", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 49915 1727204320.97586: no more pending results, returning what we have 49915 1727204320.97589: results queue empty 49915 1727204320.97590: checking for any_errors_fatal 49915 1727204320.97594: done checking for any_errors_fatal 49915 1727204320.97595: checking for max_fail_percentage 49915 
1727204320.97596: done checking for max_fail_percentage 49915 1727204320.97597: checking to see if all hosts have failed and the running result is not ok 49915 1727204320.97598: done checking to see if all hosts have failed 49915 1727204320.97599: getting the remaining hosts for this loop 49915 1727204320.97600: done getting the remaining hosts for this loop 49915 1727204320.97603: getting the next task for host managed-node2 49915 1727204320.97609: done getting next task for host managed-node2 49915 1727204320.97620: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 49915 1727204320.97623: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204320.97633: getting variables 49915 1727204320.97634: in VariableManager get_vars() 49915 1727204320.97666: Calling all_inventory to load vars for managed-node2 49915 1727204320.97668: Calling groups_inventory to load vars for managed-node2 49915 1727204320.97670: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204320.97680: Calling all_plugins_play to load vars for managed-node2 49915 1727204320.97682: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204320.97685: Calling groups_plugins_play to load vars for managed-node2 49915 1727204320.98428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204320.99299: done with get_vars() 49915 1727204320.99313: done getting variables 49915 1727204320.99354: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:58:40 -0400 (0:00:00.035) 0:00:27.700 ***** 49915 1727204320.99383: entering _queue_task() for managed-node2/debug 49915 1727204320.99595: worker is 1 (out of 1 available) 49915 1727204320.99609: exiting _queue_task() for managed-node2/debug 49915 1727204320.99620: done queuing things up, now waiting for results queue to drain 49915 1727204320.99621: waiting for pending results... 
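The records above show the "Show debug messages for the network_connections" task from roles/network/tasks/main.yml:181 printing __network_connections_result, and the records that follow show its companion "Show debug messages for the network_state" task at main.yml:186 being skipped because network_state is empty. The role's YAML itself is not reproduced in this log; a minimal sketch of debug tasks with this shape, assuming only the task names, the debug action plugin, and the variables and conditional recorded in the surrounding records, is:

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result

- name: Show debug messages for the network_state
  debug:
    var: network_state
  when: network_state != {}
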
49915 1727204320.99791: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 49915 1727204320.99880: in run() - task 028d2410-947f-dcd7-b5af-000000000079 49915 1727204320.99894: variable 'ansible_search_path' from source: unknown 49915 1727204320.99897: variable 'ansible_search_path' from source: unknown 49915 1727204320.99927: calling self._execute() 49915 1727204320.99999: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204321.00003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204321.00012: variable 'omit' from source: magic vars 49915 1727204321.00285: variable 'ansible_distribution_major_version' from source: facts 49915 1727204321.00294: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204321.00374: variable 'network_state' from source: role '' defaults 49915 1727204321.00387: Evaluated conditional (network_state != {}): False 49915 1727204321.00390: when evaluation is False, skipping this task 49915 1727204321.00395: _execute() done 49915 1727204321.00398: dumping result to json 49915 1727204321.00400: done dumping result, returning 49915 1727204321.00403: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-dcd7-b5af-000000000079] 49915 1727204321.00414: sending task result for task 028d2410-947f-dcd7-b5af-000000000079 49915 1727204321.00497: done sending task result for task 028d2410-947f-dcd7-b5af-000000000079 49915 1727204321.00499: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 49915 1727204321.00556: no more pending results, returning what we have 49915 1727204321.00559: results queue empty 49915 1727204321.00560: checking for any_errors_fatal 49915 1727204321.00567: done checking for any_errors_fatal 49915 1727204321.00568: checking for max_fail_percentage 49915 1727204321.00569: done checking for max_fail_percentage 49915 1727204321.00570: checking to see if all hosts have failed and the running result is not ok 49915 1727204321.00571: done checking to see if all hosts have failed 49915 1727204321.00572: getting the remaining hosts for this loop 49915 1727204321.00574: done getting the remaining hosts for this loop 49915 1727204321.00579: getting the next task for host managed-node2 49915 1727204321.00585: done getting next task for host managed-node2 49915 1727204321.00588: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 49915 1727204321.00591: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204321.00606: getting variables 49915 1727204321.00607: in VariableManager get_vars() 49915 1727204321.00639: Calling all_inventory to load vars for managed-node2 49915 1727204321.00641: Calling groups_inventory to load vars for managed-node2 49915 1727204321.00643: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204321.00652: Calling all_plugins_play to load vars for managed-node2 49915 1727204321.00654: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204321.00656: Calling groups_plugins_play to load vars for managed-node2 49915 1727204321.01506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204321.02365: done with get_vars() 49915 1727204321.02381: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:58:41 -0400 (0:00:00.030) 0:00:27.730 ***** 49915 1727204321.02451: entering _queue_task() for managed-node2/ping 49915 1727204321.02669: worker is 1 (out of 1 available) 49915 1727204321.02683: exiting _queue_task() for managed-node2/ping 49915 1727204321.02694: done queuing things up, now waiting for results queue to drain 49915 1727204321.02696: waiting for pending results... 49915 1727204321.02870: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 49915 1727204321.02955: in run() - task 028d2410-947f-dcd7-b5af-00000000007a 49915 1727204321.02966: variable 'ansible_search_path' from source: unknown 49915 1727204321.02970: variable 'ansible_search_path' from source: unknown 49915 1727204321.02999: calling self._execute() 49915 1727204321.03072: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204321.03078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204321.03085: variable 'omit' from source: magic vars 49915 1727204321.03360: variable 'ansible_distribution_major_version' from source: facts 49915 1727204321.03364: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204321.03370: variable 'omit' from source: magic vars 49915 1727204321.03411: variable 'omit' from source: magic vars 49915 1727204321.03437: variable 'omit' from source: magic vars 49915 1727204321.03470: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204321.03498: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204321.03514: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204321.03530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204321.03540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204321.03563: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204321.03566: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204321.03568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204321.03639: Set connection var ansible_connection to ssh 49915 1727204321.03642: Set connection var 
ansible_shell_type to sh 49915 1727204321.03647: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204321.03655: Set connection var ansible_shell_executable to /bin/sh 49915 1727204321.03660: Set connection var ansible_timeout to 10 49915 1727204321.03666: Set connection var ansible_pipelining to False 49915 1727204321.03689: variable 'ansible_shell_executable' from source: unknown 49915 1727204321.03693: variable 'ansible_connection' from source: unknown 49915 1727204321.03696: variable 'ansible_module_compression' from source: unknown 49915 1727204321.03698: variable 'ansible_shell_type' from source: unknown 49915 1727204321.03701: variable 'ansible_shell_executable' from source: unknown 49915 1727204321.03703: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204321.03705: variable 'ansible_pipelining' from source: unknown 49915 1727204321.03707: variable 'ansible_timeout' from source: unknown 49915 1727204321.03713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204321.03857: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 49915 1727204321.03865: variable 'omit' from source: magic vars 49915 1727204321.03868: starting attempt loop 49915 1727204321.03871: running the handler 49915 1727204321.03887: _low_level_execute_command(): starting 49915 1727204321.03894: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204321.04407: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204321.04415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204321.04418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.04464: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204321.04467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204321.04469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204321.04555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204321.06289: stdout chunk (state=3): >>>/root <<< 49915 1727204321.06389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204321.06418: stderr chunk (state=3): >>><<< 49915 1727204321.06422: stdout chunk (state=3): >>><<< 49915 1727204321.06443: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204321.06457: _low_level_execute_command(): starting 49915 1727204321.06460: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204321.0644305-51814-84552287240146 `" && echo ansible-tmp-1727204321.0644305-51814-84552287240146="` echo /root/.ansible/tmp/ansible-tmp-1727204321.0644305-51814-84552287240146 `" ) && sleep 0' 49915 1727204321.06872: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204321.06906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204321.06910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 49915 1727204321.06919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 49915 1727204321.06922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.06966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204321.06969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204321.06974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204321.07045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204321.08965: stdout chunk (state=3): >>>ansible-tmp-1727204321.0644305-51814-84552287240146=/root/.ansible/tmp/ansible-tmp-1727204321.0644305-51814-84552287240146 <<< 49915 1727204321.09066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204321.09093: stderr chunk (state=3): >>><<< 49915 
1727204321.09096: stdout chunk (state=3): >>><<< 49915 1727204321.09115: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204321.0644305-51814-84552287240146=/root/.ansible/tmp/ansible-tmp-1727204321.0644305-51814-84552287240146 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204321.09151: variable 'ansible_module_compression' from source: unknown 49915 1727204321.09185: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 49915 1727204321.09214: variable 'ansible_facts' from source: unknown 49915 1727204321.09265: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204321.0644305-51814-84552287240146/AnsiballZ_ping.py 49915 1727204321.09357: Sending initial data 49915 1727204321.09360: Sent initial data (152 bytes) 49915 1727204321.09785: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204321.09789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204321.09791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.09808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.09851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204321.09854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204321.09931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204321.11505: stderr chunk (state=3): >>>debug2: Remote 
version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 49915 1727204321.11510: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204321.11571: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49915 1727204321.11652: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpov_70xep /root/.ansible/tmp/ansible-tmp-1727204321.0644305-51814-84552287240146/AnsiballZ_ping.py <<< 49915 1727204321.11655: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204321.0644305-51814-84552287240146/AnsiballZ_ping.py" <<< 49915 1727204321.11719: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpov_70xep" to remote "/root/.ansible/tmp/ansible-tmp-1727204321.0644305-51814-84552287240146/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204321.0644305-51814-84552287240146/AnsiballZ_ping.py" <<< 49915 1727204321.12328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204321.12363: stderr chunk (state=3): >>><<< 49915 1727204321.12366: stdout chunk (state=3): >>><<< 49915 1727204321.12408: done transferring module to remote 49915 1727204321.12417: _low_level_execute_command(): starting 49915 1727204321.12421: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204321.0644305-51814-84552287240146/ /root/.ansible/tmp/ansible-tmp-1727204321.0644305-51814-84552287240146/AnsiballZ_ping.py && sleep 0' 49915 1727204321.12848: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204321.12851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204321.12853: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.12855: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204321.12857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.12908: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204321.12911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204321.12992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204321.14781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204321.14803: stderr chunk (state=3): >>><<< 49915 1727204321.14807: stdout chunk (state=3): >>><<< 49915 1727204321.14821: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204321.14824: _low_level_execute_command(): starting 49915 1727204321.14834: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204321.0644305-51814-84552287240146/AnsiballZ_ping.py && sleep 0' 49915 1727204321.15242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204321.15246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.15248: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204321.15250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.15301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204321.15304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204321.15386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 
1727204321.30388: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 49915 1727204321.31870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. <<< 49915 1727204321.31874: stdout chunk (state=3): >>><<< 49915 1727204321.31880: stderr chunk (state=3): >>><<< 49915 1727204321.31883: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 49915 1727204321.31886: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204321.0644305-51814-84552287240146/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204321.31888: _low_level_execute_command(): starting 49915 1727204321.31890: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204321.0644305-51814-84552287240146/ > /dev/null 2>&1 && sleep 0' 49915 1727204321.32306: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204321.32321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204321.32338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.32379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204321.32393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204321.32471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204321.34310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204321.34335: stderr chunk (state=3): >>><<< 49915 1727204321.34338: stdout chunk (state=3): >>><<< 49915 1727204321.34350: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204321.34360: handler run complete 49915 1727204321.34374: attempt loop complete, returning result 49915 1727204321.34379: _execute() done 49915 1727204321.34381: dumping result to json 49915 1727204321.34384: done dumping result, returning 49915 1727204321.34392: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-dcd7-b5af-00000000007a] 49915 1727204321.34396: sending task result for task 028d2410-947f-dcd7-b5af-00000000007a 49915 1727204321.34482: done sending task result for task 028d2410-947f-dcd7-b5af-00000000007a 49915 1727204321.34485: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 49915 1727204321.34541: no more pending results, returning what we have 49915 1727204321.34544: results queue empty 49915 1727204321.34545: checking for any_errors_fatal 49915 1727204321.34551: done checking for any_errors_fatal 49915 1727204321.34551: checking for max_fail_percentage 49915 1727204321.34553: done checking for max_fail_percentage 49915 1727204321.34554: checking to see if all hosts have failed and the running result is not ok 49915 1727204321.34555: done checking to see if all hosts have failed 49915 1727204321.34556: getting the remaining hosts for this loop 49915 1727204321.34557: done getting the remaining hosts for this loop 49915 1727204321.34560: getting the next task for host managed-node2 49915 1727204321.34568: done getting next task 
for host managed-node2 49915 1727204321.34571: ^ task is: TASK: meta (role_complete) 49915 1727204321.34574: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204321.34586: getting variables 49915 1727204321.34588: in VariableManager get_vars() 49915 1727204321.34630: Calling all_inventory to load vars for managed-node2 49915 1727204321.34633: Calling groups_inventory to load vars for managed-node2 49915 1727204321.34635: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204321.34644: Calling all_plugins_play to load vars for managed-node2 49915 1727204321.34647: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204321.34650: Calling groups_plugins_play to load vars for managed-node2 49915 1727204321.35457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204321.36443: done with get_vars() 49915 1727204321.36459: done getting variables 49915 1727204321.36522: done queuing things up, now waiting for results queue to drain 49915 1727204321.36524: results queue empty 49915 1727204321.36525: checking for any_errors_fatal 49915 1727204321.36526: done checking for any_errors_fatal 49915 1727204321.36527: checking for max_fail_percentage 49915 1727204321.36527: done checking for max_fail_percentage 49915 1727204321.36528: checking to see if all hosts have failed and the running result is not ok 49915 1727204321.36528: done checking to see if all hosts have failed 49915 1727204321.36529: getting the remaining hosts for this loop 49915 1727204321.36529: done getting the remaining hosts for this loop 49915 1727204321.36531: getting the next task for host managed-node2 49915 1727204321.36534: done getting next task for host managed-node2 49915 1727204321.36536: ^ task is: TASK: Include the task 'manage_test_interface.yml' 49915 1727204321.36537: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204321.36539: getting variables 49915 1727204321.36539: in VariableManager get_vars() 49915 1727204321.36549: Calling all_inventory to load vars for managed-node2 49915 1727204321.36551: Calling groups_inventory to load vars for managed-node2 49915 1727204321.36552: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204321.36556: Calling all_plugins_play to load vars for managed-node2 49915 1727204321.36557: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204321.36559: Calling groups_plugins_play to load vars for managed-node2 49915 1727204321.37200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204321.38058: done with get_vars() 49915 1727204321.38071: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:73 Tuesday 24 September 2024 14:58:41 -0400 (0:00:00.356) 0:00:28.087 ***** 49915 1727204321.38126: entering _queue_task() for managed-node2/include_tasks 49915 1727204321.38383: worker is 1 (out of 1 available) 49915 1727204321.38397: exiting _queue_task() for managed-node2/include_tasks 49915 1727204321.38409: done queuing things up, now waiting for results queue to drain 49915 1727204321.38410: waiting for pending results... 49915 1727204321.38590: running TaskExecutor() for managed-node2/TASK: Include the task 'manage_test_interface.yml' 49915 1727204321.38655: in run() - task 028d2410-947f-dcd7-b5af-0000000000aa 49915 1727204321.38667: variable 'ansible_search_path' from source: unknown 49915 1727204321.38700: calling self._execute() 49915 1727204321.38774: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204321.38781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204321.38790: variable 'omit' from source: magic vars 49915 1727204321.39063: variable 'ansible_distribution_major_version' from source: facts 49915 1727204321.39076: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204321.39081: _execute() done 49915 1727204321.39084: dumping result to json 49915 1727204321.39087: done dumping result, returning 49915 1727204321.39097: done running TaskExecutor() for managed-node2/TASK: Include the task 'manage_test_interface.yml' [028d2410-947f-dcd7-b5af-0000000000aa] 49915 1727204321.39100: sending task result for task 028d2410-947f-dcd7-b5af-0000000000aa 49915 1727204321.39189: done sending task result for task 028d2410-947f-dcd7-b5af-0000000000aa 49915 1727204321.39192: WORKER PROCESS EXITING 49915 1727204321.39226: no more pending results, returning what we have 49915 1727204321.39231: in VariableManager get_vars() 49915 1727204321.39279: Calling all_inventory to load vars for managed-node2 49915 1727204321.39283: Calling groups_inventory to load vars for managed-node2 49915 1727204321.39285: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204321.39297: Calling all_plugins_play to load vars for managed-node2 49915 1727204321.39299: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204321.39302: Calling groups_plugins_play to load vars for managed-node2 49915 1727204321.40191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204321.41772: done with get_vars() 49915 
1727204321.41795: variable 'ansible_search_path' from source: unknown 49915 1727204321.41810: we have included files to process 49915 1727204321.41811: generating all_blocks data 49915 1727204321.41816: done generating all_blocks data 49915 1727204321.41821: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 49915 1727204321.41822: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 49915 1727204321.41824: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 49915 1727204321.42228: in VariableManager get_vars() 49915 1727204321.42252: done with get_vars() 49915 1727204321.42937: done processing included file 49915 1727204321.42940: iterating over new_blocks loaded from include file 49915 1727204321.42941: in VariableManager get_vars() 49915 1727204321.42960: done with get_vars() 49915 1727204321.42962: filtering new block on tags 49915 1727204321.42996: done filtering new block on tags 49915 1727204321.42999: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node2 49915 1727204321.43006: extending task lists for all hosts with included blocks 49915 1727204321.45452: done extending task lists 49915 1727204321.45453: done processing included files 49915 1727204321.45454: results queue empty 49915 1727204321.45454: checking for any_errors_fatal 49915 1727204321.45455: done checking for any_errors_fatal 49915 1727204321.45455: checking for max_fail_percentage 49915 1727204321.45456: done checking for max_fail_percentage 49915 1727204321.45457: checking to see if all hosts have failed and the running result is not ok 49915 1727204321.45457: done checking to see if all hosts have failed 49915 1727204321.45458: getting the remaining hosts for this loop 49915 1727204321.45459: done getting the remaining hosts for this loop 49915 1727204321.45460: getting the next task for host managed-node2 49915 1727204321.45463: done getting next task for host managed-node2 49915 1727204321.45465: ^ task is: TASK: Ensure state in ["present", "absent"] 49915 1727204321.45468: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204321.45470: getting variables 49915 1727204321.45471: in VariableManager get_vars() 49915 1727204321.45483: Calling all_inventory to load vars for managed-node2 49915 1727204321.45484: Calling groups_inventory to load vars for managed-node2 49915 1727204321.45486: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204321.45490: Calling all_plugins_play to load vars for managed-node2 49915 1727204321.45492: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204321.45493: Calling groups_plugins_play to load vars for managed-node2 49915 1727204321.46127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204321.47050: done with get_vars() 49915 1727204321.47065: done getting variables 49915 1727204321.47099: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 14:58:41 -0400 (0:00:00.089) 0:00:28.177 ***** 49915 1727204321.47121: entering _queue_task() for managed-node2/fail 49915 1727204321.47377: worker is 1 (out of 1 available) 49915 1727204321.47388: exiting _queue_task() for managed-node2/fail 49915 1727204321.47400: done queuing things up, now waiting for results queue to drain 49915 1727204321.47402: waiting for pending results... 49915 1727204321.47577: running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] 49915 1727204321.47641: in run() - task 028d2410-947f-dcd7-b5af-00000000093c 49915 1727204321.47653: variable 'ansible_search_path' from source: unknown 49915 1727204321.47657: variable 'ansible_search_path' from source: unknown 49915 1727204321.47688: calling self._execute() 49915 1727204321.47760: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204321.47765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204321.47773: variable 'omit' from source: magic vars 49915 1727204321.48052: variable 'ansible_distribution_major_version' from source: facts 49915 1727204321.48064: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204321.48156: variable 'state' from source: include params 49915 1727204321.48159: Evaluated conditional (state not in ["present", "absent"]): False 49915 1727204321.48164: when evaluation is False, skipping this task 49915 1727204321.48168: _execute() done 49915 1727204321.48170: dumping result to json 49915 1727204321.48174: done dumping result, returning 49915 1727204321.48186: done running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] [028d2410-947f-dcd7-b5af-00000000093c] 49915 1727204321.48188: sending task result for task 028d2410-947f-dcd7-b5af-00000000093c 49915 1727204321.48264: done sending task result for task 028d2410-947f-dcd7-b5af-00000000093c 49915 1727204321.48267: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 49915 1727204321.48334: no more pending 
results, returning what we have 49915 1727204321.48337: results queue empty 49915 1727204321.48338: checking for any_errors_fatal 49915 1727204321.48339: done checking for any_errors_fatal 49915 1727204321.48340: checking for max_fail_percentage 49915 1727204321.48342: done checking for max_fail_percentage 49915 1727204321.48343: checking to see if all hosts have failed and the running result is not ok 49915 1727204321.48344: done checking to see if all hosts have failed 49915 1727204321.48345: getting the remaining hosts for this loop 49915 1727204321.48346: done getting the remaining hosts for this loop 49915 1727204321.48350: getting the next task for host managed-node2 49915 1727204321.48356: done getting next task for host managed-node2 49915 1727204321.48358: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 49915 1727204321.48361: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204321.48365: getting variables 49915 1727204321.48366: in VariableManager get_vars() 49915 1727204321.48412: Calling all_inventory to load vars for managed-node2 49915 1727204321.48415: Calling groups_inventory to load vars for managed-node2 49915 1727204321.48417: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204321.48427: Calling all_plugins_play to load vars for managed-node2 49915 1727204321.48429: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204321.48432: Calling groups_plugins_play to load vars for managed-node2 49915 1727204321.49163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204321.50017: done with get_vars() 49915 1727204321.50031: done getting variables 49915 1727204321.50073: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 14:58:41 -0400 (0:00:00.029) 0:00:28.207 ***** 49915 1727204321.50095: entering _queue_task() for managed-node2/fail 49915 1727204321.50301: worker is 1 (out of 1 available) 49915 1727204321.50316: exiting _queue_task() for managed-node2/fail 49915 1727204321.50329: done queuing things up, now waiting for results queue to drain 49915 1727204321.50330: waiting for pending results... 
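The two fail tasks in manage_test_interface.yml (task paths :3 and :8) act as input guards: each runs only if its parameter is invalid, and both are skipped here because the false_condition values reported in the skip results above and below do not hold. Their YAML is not part of this log; a minimal sketch of the guard pattern, assuming only the task names, the fail action plugin, and the conditions shown in the skip results (the msg text is illustrative, not taken from the source), is:

- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be one of: present, absent"
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be one of: dummy, tap, veth"
  when: type not in ["dummy", "tap", "veth"]
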
49915 1727204321.50491: running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] 49915 1727204321.50558: in run() - task 028d2410-947f-dcd7-b5af-00000000093d 49915 1727204321.50566: variable 'ansible_search_path' from source: unknown 49915 1727204321.50569: variable 'ansible_search_path' from source: unknown 49915 1727204321.50602: calling self._execute() 49915 1727204321.50670: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204321.50674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204321.50688: variable 'omit' from source: magic vars 49915 1727204321.50960: variable 'ansible_distribution_major_version' from source: facts 49915 1727204321.50970: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204321.51070: variable 'type' from source: play vars 49915 1727204321.51073: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 49915 1727204321.51078: when evaluation is False, skipping this task 49915 1727204321.51082: _execute() done 49915 1727204321.51084: dumping result to json 49915 1727204321.51088: done dumping result, returning 49915 1727204321.51095: done running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] [028d2410-947f-dcd7-b5af-00000000093d] 49915 1727204321.51100: sending task result for task 028d2410-947f-dcd7-b5af-00000000093d 49915 1727204321.51179: done sending task result for task 028d2410-947f-dcd7-b5af-00000000093d 49915 1727204321.51182: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 49915 1727204321.51249: no more pending results, returning what we have 49915 1727204321.51253: results queue empty 49915 1727204321.51254: checking for any_errors_fatal 49915 1727204321.51258: done checking for any_errors_fatal 49915 1727204321.51259: checking for max_fail_percentage 49915 1727204321.51261: done checking for max_fail_percentage 49915 1727204321.51262: checking to see if all hosts have failed and the running result is not ok 49915 1727204321.51262: done checking to see if all hosts have failed 49915 1727204321.51263: getting the remaining hosts for this loop 49915 1727204321.51264: done getting the remaining hosts for this loop 49915 1727204321.51267: getting the next task for host managed-node2 49915 1727204321.51272: done getting next task for host managed-node2 49915 1727204321.51274: ^ task is: TASK: Include the task 'show_interfaces.yml' 49915 1727204321.51279: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204321.51282: getting variables 49915 1727204321.51283: in VariableManager get_vars() 49915 1727204321.51319: Calling all_inventory to load vars for managed-node2 49915 1727204321.51322: Calling groups_inventory to load vars for managed-node2 49915 1727204321.51323: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204321.51333: Calling all_plugins_play to load vars for managed-node2 49915 1727204321.51335: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204321.51338: Calling groups_plugins_play to load vars for managed-node2 49915 1727204321.52193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204321.53048: done with get_vars() 49915 1727204321.53062: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 14:58:41 -0400 (0:00:00.030) 0:00:28.237 ***** 49915 1727204321.53132: entering _queue_task() for managed-node2/include_tasks 49915 1727204321.53344: worker is 1 (out of 1 available) 49915 1727204321.53356: exiting _queue_task() for managed-node2/include_tasks 49915 1727204321.53367: done queuing things up, now waiting for results queue to drain 49915 1727204321.53368: waiting for pending results... 49915 1727204321.53530: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 49915 1727204321.53597: in run() - task 028d2410-947f-dcd7-b5af-00000000093e 49915 1727204321.53608: variable 'ansible_search_path' from source: unknown 49915 1727204321.53614: variable 'ansible_search_path' from source: unknown 49915 1727204321.53640: calling self._execute() 49915 1727204321.53709: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204321.53715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204321.53724: variable 'omit' from source: magic vars 49915 1727204321.53991: variable 'ansible_distribution_major_version' from source: facts 49915 1727204321.54000: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204321.54005: _execute() done 49915 1727204321.54008: dumping result to json 49915 1727204321.54015: done dumping result, returning 49915 1727204321.54018: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [028d2410-947f-dcd7-b5af-00000000093e] 49915 1727204321.54024: sending task result for task 028d2410-947f-dcd7-b5af-00000000093e 49915 1727204321.54103: done sending task result for task 028d2410-947f-dcd7-b5af-00000000093e 49915 1727204321.54106: WORKER PROCESS EXITING 49915 1727204321.54157: no more pending results, returning what we have 49915 1727204321.54162: in VariableManager get_vars() 49915 1727204321.54202: Calling all_inventory to load vars for managed-node2 49915 1727204321.54204: Calling groups_inventory to load vars for managed-node2 49915 1727204321.54206: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204321.54219: Calling all_plugins_play to load vars for managed-node2 49915 1727204321.54222: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204321.54224: Calling groups_plugins_play to load vars for managed-node2 49915 1727204321.54956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 49915 1727204321.55817: done with get_vars() 49915 1727204321.55829: variable 'ansible_search_path' from source: unknown 49915 1727204321.55830: variable 'ansible_search_path' from source: unknown 49915 1727204321.55853: we have included files to process 49915 1727204321.55854: generating all_blocks data 49915 1727204321.55855: done generating all_blocks data 49915 1727204321.55858: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49915 1727204321.55859: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49915 1727204321.55861: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 49915 1727204321.55931: in VariableManager get_vars() 49915 1727204321.55946: done with get_vars() 49915 1727204321.56022: done processing included file 49915 1727204321.56024: iterating over new_blocks loaded from include file 49915 1727204321.56025: in VariableManager get_vars() 49915 1727204321.56037: done with get_vars() 49915 1727204321.56038: filtering new block on tags 49915 1727204321.56048: done filtering new block on tags 49915 1727204321.56050: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 49915 1727204321.56053: extending task lists for all hosts with included blocks 49915 1727204321.56278: done extending task lists 49915 1727204321.56279: done processing included files 49915 1727204321.56280: results queue empty 49915 1727204321.56280: checking for any_errors_fatal 49915 1727204321.56282: done checking for any_errors_fatal 49915 1727204321.56283: checking for max_fail_percentage 49915 1727204321.56283: done checking for max_fail_percentage 49915 1727204321.56284: checking to see if all hosts have failed and the running result is not ok 49915 1727204321.56284: done checking to see if all hosts have failed 49915 1727204321.56285: getting the remaining hosts for this loop 49915 1727204321.56286: done getting the remaining hosts for this loop 49915 1727204321.56287: getting the next task for host managed-node2 49915 1727204321.56290: done getting next task for host managed-node2 49915 1727204321.56291: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 49915 1727204321.56293: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204321.56295: getting variables 49915 1727204321.56295: in VariableManager get_vars() 49915 1727204321.56304: Calling all_inventory to load vars for managed-node2 49915 1727204321.56306: Calling groups_inventory to load vars for managed-node2 49915 1727204321.56307: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204321.56314: Calling all_plugins_play to load vars for managed-node2 49915 1727204321.56315: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204321.56317: Calling groups_plugins_play to load vars for managed-node2 49915 1727204321.57007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204321.57844: done with get_vars() 49915 1727204321.57860: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:58:41 -0400 (0:00:00.047) 0:00:28.285 ***** 49915 1727204321.57911: entering _queue_task() for managed-node2/include_tasks 49915 1727204321.58145: worker is 1 (out of 1 available) 49915 1727204321.58158: exiting _queue_task() for managed-node2/include_tasks 49915 1727204321.58169: done queuing things up, now waiting for results queue to drain 49915 1727204321.58171: waiting for pending results... 49915 1727204321.58343: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 49915 1727204321.58420: in run() - task 028d2410-947f-dcd7-b5af-000000000aa0 49915 1727204321.58430: variable 'ansible_search_path' from source: unknown 49915 1727204321.58433: variable 'ansible_search_path' from source: unknown 49915 1727204321.58461: calling self._execute() 49915 1727204321.58530: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204321.58535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204321.58543: variable 'omit' from source: magic vars 49915 1727204321.58810: variable 'ansible_distribution_major_version' from source: facts 49915 1727204321.58820: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204321.58827: _execute() done 49915 1727204321.58830: dumping result to json 49915 1727204321.58834: done dumping result, returning 49915 1727204321.58837: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [028d2410-947f-dcd7-b5af-000000000aa0] 49915 1727204321.58847: sending task result for task 028d2410-947f-dcd7-b5af-000000000aa0 49915 1727204321.58926: done sending task result for task 028d2410-947f-dcd7-b5af-000000000aa0 49915 1727204321.58929: WORKER PROCESS EXITING 49915 1727204321.58978: no more pending results, returning what we have 49915 1727204321.58983: in VariableManager get_vars() 49915 1727204321.59032: Calling all_inventory to load vars for managed-node2 49915 1727204321.59035: Calling groups_inventory to load vars for managed-node2 49915 1727204321.59037: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204321.59047: Calling all_plugins_play to load vars for managed-node2 49915 1727204321.59049: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204321.59052: Calling groups_plugins_play to load vars for managed-node2 49915 1727204321.59803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 49915 1727204321.60740: done with get_vars() 49915 1727204321.60752: variable 'ansible_search_path' from source: unknown 49915 1727204321.60752: variable 'ansible_search_path' from source: unknown 49915 1727204321.60791: we have included files to process 49915 1727204321.60792: generating all_blocks data 49915 1727204321.60794: done generating all_blocks data 49915 1727204321.60795: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49915 1727204321.60796: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49915 1727204321.60798: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 49915 1727204321.60971: done processing included file 49915 1727204321.60973: iterating over new_blocks loaded from include file 49915 1727204321.60974: in VariableManager get_vars() 49915 1727204321.60988: done with get_vars() 49915 1727204321.60989: filtering new block on tags 49915 1727204321.61001: done filtering new block on tags 49915 1727204321.61002: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 49915 1727204321.61006: extending task lists for all hosts with included blocks 49915 1727204321.61094: done extending task lists 49915 1727204321.61095: done processing included files 49915 1727204321.61095: results queue empty 49915 1727204321.61096: checking for any_errors_fatal 49915 1727204321.61098: done checking for any_errors_fatal 49915 1727204321.61098: checking for max_fail_percentage 49915 1727204321.61099: done checking for max_fail_percentage 49915 1727204321.61100: checking to see if all hosts have failed and the running result is not ok 49915 1727204321.61100: done checking to see if all hosts have failed 49915 1727204321.61101: getting the remaining hosts for this loop 49915 1727204321.61101: done getting the remaining hosts for this loop 49915 1727204321.61103: getting the next task for host managed-node2 49915 1727204321.61107: done getting next task for host managed-node2 49915 1727204321.61108: ^ task is: TASK: Gather current interface info 49915 1727204321.61110: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 49915 1727204321.61114: getting variables 49915 1727204321.61114: in VariableManager get_vars() 49915 1727204321.61124: Calling all_inventory to load vars for managed-node2 49915 1727204321.61126: Calling groups_inventory to load vars for managed-node2 49915 1727204321.61127: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204321.61131: Calling all_plugins_play to load vars for managed-node2 49915 1727204321.61132: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204321.61134: Calling groups_plugins_play to load vars for managed-node2 49915 1727204321.61763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204321.62604: done with get_vars() 49915 1727204321.62621: done getting variables 49915 1727204321.62651: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:58:41 -0400 (0:00:00.047) 0:00:28.332 ***** 49915 1727204321.62673: entering _queue_task() for managed-node2/command 49915 1727204321.62922: worker is 1 (out of 1 available) 49915 1727204321.62935: exiting _queue_task() for managed-node2/command 49915 1727204321.62948: done queuing things up, now waiting for results queue to drain 49915 1727204321.62949: waiting for pending results... 
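The trace above walks an include chain: manage_test_interface.yml:13 includes show_interfaces.yml, and show_interfaces.yml:3 in turn includes get_current_interfaces.yml, with each include gated on the ansible_distribution_major_version != '6' conditional that the log evaluates to True just before queuing. The task files themselves are not reproduced in the log; a minimal sketch of that chain, assuming plain ansible.builtin.include_tasks with no extra arguments, would be:

# tasks/manage_test_interface.yml (sketch; only the include target and task name are confirmed by the log)
- name: Include the task 'show_interfaces.yml'
  ansible.builtin.include_tasks: show_interfaces.yml

# tasks/show_interfaces.yml (sketch)
- name: Include the task 'get_current_interfaces.yml'
  ansible.builtin.include_tasks: get_current_interfaces.yml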
49915 1727204321.63122: running TaskExecutor() for managed-node2/TASK: Gather current interface info 49915 1727204321.63201: in run() - task 028d2410-947f-dcd7-b5af-000000000ad7 49915 1727204321.63211: variable 'ansible_search_path' from source: unknown 49915 1727204321.63218: variable 'ansible_search_path' from source: unknown 49915 1727204321.63244: calling self._execute() 49915 1727204321.63315: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204321.63318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204321.63326: variable 'omit' from source: magic vars 49915 1727204321.63599: variable 'ansible_distribution_major_version' from source: facts 49915 1727204321.63615: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204321.63618: variable 'omit' from source: magic vars 49915 1727204321.63648: variable 'omit' from source: magic vars 49915 1727204321.63673: variable 'omit' from source: magic vars 49915 1727204321.63705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204321.63736: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204321.63751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204321.63764: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204321.63774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204321.63799: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204321.63802: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204321.63805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204321.63872: Set connection var ansible_connection to ssh 49915 1727204321.63877: Set connection var ansible_shell_type to sh 49915 1727204321.63882: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204321.63890: Set connection var ansible_shell_executable to /bin/sh 49915 1727204321.63895: Set connection var ansible_timeout to 10 49915 1727204321.63901: Set connection var ansible_pipelining to False 49915 1727204321.63920: variable 'ansible_shell_executable' from source: unknown 49915 1727204321.63923: variable 'ansible_connection' from source: unknown 49915 1727204321.63926: variable 'ansible_module_compression' from source: unknown 49915 1727204321.63928: variable 'ansible_shell_type' from source: unknown 49915 1727204321.63930: variable 'ansible_shell_executable' from source: unknown 49915 1727204321.63934: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204321.63936: variable 'ansible_pipelining' from source: unknown 49915 1727204321.63938: variable 'ansible_timeout' from source: unknown 49915 1727204321.63949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204321.64043: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204321.64052: variable 'omit' from source: magic vars 49915 
1727204321.64061: starting attempt loop 49915 1727204321.64064: running the handler 49915 1727204321.64074: _low_level_execute_command(): starting 49915 1727204321.64083: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204321.64571: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204321.64609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204321.64613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.64615: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204321.64618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.64666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204321.64669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204321.64671: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204321.64756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204321.66479: stdout chunk (state=3): >>>/root <<< 49915 1727204321.66585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204321.66615: stderr chunk (state=3): >>><<< 49915 1727204321.66618: stdout chunk (state=3): >>><<< 49915 1727204321.66635: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204321.66654: _low_level_execute_command(): starting 49915 1727204321.66658: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir 
-p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204321.6663532-51842-203321649447448 `" && echo ansible-tmp-1727204321.6663532-51842-203321649447448="` echo /root/.ansible/tmp/ansible-tmp-1727204321.6663532-51842-203321649447448 `" ) && sleep 0' 49915 1727204321.67072: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204321.67114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204321.67117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.67127: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204321.67129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 49915 1727204321.67132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.67173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204321.67181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204321.67184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204321.67252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204321.69197: stdout chunk (state=3): >>>ansible-tmp-1727204321.6663532-51842-203321649447448=/root/.ansible/tmp/ansible-tmp-1727204321.6663532-51842-203321649447448 <<< 49915 1727204321.69302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204321.69330: stderr chunk (state=3): >>><<< 49915 1727204321.69333: stdout chunk (state=3): >>><<< 49915 1727204321.69346: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204321.6663532-51842-203321649447448=/root/.ansible/tmp/ansible-tmp-1727204321.6663532-51842-203321649447448 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204321.69375: variable 'ansible_module_compression' from source: unknown 49915 1727204321.69415: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49915 1727204321.69445: variable 'ansible_facts' from source: unknown 49915 1727204321.69503: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204321.6663532-51842-203321649447448/AnsiballZ_command.py 49915 1727204321.69605: Sending initial data 49915 1727204321.69608: Sent initial data (156 bytes) 49915 1727204321.70056: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204321.70059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204321.70062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.70064: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204321.70066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 49915 1727204321.70068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.70122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204321.70130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204321.70133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204321.70201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204321.71803: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 49915 1727204321.71808: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204321.71870: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49915 1727204321.71941: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp4t5tij1s /root/.ansible/tmp/ansible-tmp-1727204321.6663532-51842-203321649447448/AnsiballZ_command.py <<< 49915 1727204321.71946: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204321.6663532-51842-203321649447448/AnsiballZ_command.py" <<< 49915 1727204321.72013: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp4t5tij1s" to remote "/root/.ansible/tmp/ansible-tmp-1727204321.6663532-51842-203321649447448/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204321.6663532-51842-203321649447448/AnsiballZ_command.py" <<< 49915 1727204321.72663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204321.72705: stderr chunk (state=3): >>><<< 49915 1727204321.72709: stdout chunk (state=3): >>><<< 49915 1727204321.72751: done transferring module to remote 49915 1727204321.72760: _low_level_execute_command(): starting 49915 1727204321.72764: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204321.6663532-51842-203321649447448/ /root/.ansible/tmp/ansible-tmp-1727204321.6663532-51842-203321649447448/AnsiballZ_command.py && sleep 0' 49915 1727204321.73202: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204321.73205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204321.73207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49915 1727204321.73213: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204321.73215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.73257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204321.73269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204321.73348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204321.75169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204321.75193: stderr chunk (state=3): >>><<< 49915 1727204321.75196: stdout chunk (state=3): >>><<< 49915 1727204321.75210: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204321.75215: _low_level_execute_command(): starting 49915 1727204321.75218: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204321.6663532-51842-203321649447448/AnsiballZ_command.py && sleep 0' 49915 1727204321.75654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204321.75657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204321.75660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.75662: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204321.75664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found <<< 49915 1727204321.75665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.75716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204321.75719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204321.75807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204321.91363: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nlsr101\npeerlsr101", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:58:41.908856", "end": "2024-09-24 14:58:41.912213", "delta": "0:00:00.003357", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49915 1727204321.93124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 49915 1727204321.93128: stdout chunk (state=3): >>><<< 49915 1727204321.93131: stderr chunk (state=3): >>><<< 49915 1727204321.93133: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nlsr101\npeerlsr101", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:58:41.908856", "end": "2024-09-24 14:58:41.912213", "delta": "0:00:00.003357", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
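The module invocation echoed in the result above (module_args chdir=/sys/class/net, _raw_params="ls -1") pins down what the 'Gather current interface info' task runs on the remote host. A sketch of that task follows; the exact YAML layout and the register name are assumptions, the register name being inferred from the '_current_interfaces' variable that shows up later in the trace:

# tasks/get_current_interfaces.yml, task at line 3 (reconstruction from the logged module_args)
- name: Gather current interface info
  ansible.builtin.command:
    cmd: ls -1                 # _raw_params in the logged invocation
    chdir: /sys/class/net      # chdir in the logged invocation
  register: _current_interfaces   # name inferred from the later "variable '_current_interfaces'" trace entries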
49915 1727204321.93388: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204321.6663532-51842-203321649447448/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204321.93394: _low_level_execute_command(): starting 49915 1727204321.93396: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204321.6663532-51842-203321649447448/ > /dev/null 2>&1 && sleep 0' 49915 1727204321.94491: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204321.94642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204321.94797: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204321.94822: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204321.94984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204321.97058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204321.97062: stdout chunk (state=3): >>><<< 49915 1727204321.97064: stderr chunk (state=3): >>><<< 49915 1727204321.97084: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204321.97095: handler run complete 49915 1727204321.97124: Evaluated conditional (False): False 49915 1727204321.97481: attempt loop complete, returning result 49915 1727204321.97484: _execute() done 49915 1727204321.97486: dumping result to json 49915 1727204321.97488: done dumping result, returning 49915 1727204321.97490: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [028d2410-947f-dcd7-b5af-000000000ad7] 49915 1727204321.97492: sending task result for task 028d2410-947f-dcd7-b5af-000000000ad7 49915 1727204321.97560: done sending task result for task 028d2410-947f-dcd7-b5af-000000000ad7 49915 1727204321.97563: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003357", "end": "2024-09-24 14:58:41.912213", "rc": 0, "start": "2024-09-24 14:58:41.908856" } STDOUT: bonding_masters eth0 lo lsr101 peerlsr101 49915 1727204321.97639: no more pending results, returning what we have 49915 1727204321.97643: results queue empty 49915 1727204321.97643: checking for any_errors_fatal 49915 1727204321.97645: done checking for any_errors_fatal 49915 1727204321.97645: checking for max_fail_percentage 49915 1727204321.97647: done checking for max_fail_percentage 49915 1727204321.97648: checking to see if all hosts have failed and the running result is not ok 49915 1727204321.97649: done checking to see if all hosts have failed 49915 1727204321.97650: getting the remaining hosts for this loop 49915 1727204321.97651: done getting the remaining hosts for this loop 49915 1727204321.97656: getting the next task for host managed-node2 49915 1727204321.97664: done getting next task for host managed-node2 49915 1727204321.97666: ^ task is: TASK: Set current_interfaces 49915 1727204321.97672: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204321.97678: getting variables 49915 1727204321.97680: in VariableManager get_vars() 49915 1727204321.97725: Calling all_inventory to load vars for managed-node2 49915 1727204321.97728: Calling groups_inventory to load vars for managed-node2 49915 1727204321.97730: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204321.97740: Calling all_plugins_play to load vars for managed-node2 49915 1727204321.97743: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204321.97745: Calling groups_plugins_play to load vars for managed-node2 49915 1727204322.00909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204322.11654: done with get_vars() 49915 1727204322.11686: done getting variables 49915 1727204322.11862: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:58:42 -0400 (0:00:00.492) 0:00:28.825 ***** 49915 1727204322.11893: entering _queue_task() for managed-node2/set_fact 49915 1727204322.12369: worker is 1 (out of 1 available) 49915 1727204322.12383: exiting _queue_task() for managed-node2/set_fact 49915 1727204322.12395: done queuing things up, now waiting for results queue to drain 49915 1727204322.12397: waiting for pending results... 
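The task queued above, 'Set current_interfaces' (get_current_interfaces.yml:9), is a set_fact action. Given the registered command output and the fact value reported a little further down (['bonding_masters', 'eth0', 'lo', 'lsr101', 'peerlsr101']), a plausible sketch is the following; the exact Jinja2 expression is an assumption, chosen so that its result matches the logged fact:

# tasks/get_current_interfaces.yml, task at line 9 (sketch; expression assumed)
- name: Set current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"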
49915 1727204322.12635: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 49915 1727204322.12748: in run() - task 028d2410-947f-dcd7-b5af-000000000ad8 49915 1727204322.12766: variable 'ansible_search_path' from source: unknown 49915 1727204322.12838: variable 'ansible_search_path' from source: unknown 49915 1727204322.12843: calling self._execute() 49915 1727204322.12925: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204322.12937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204322.12950: variable 'omit' from source: magic vars 49915 1727204322.13379: variable 'ansible_distribution_major_version' from source: facts 49915 1727204322.13400: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204322.13414: variable 'omit' from source: magic vars 49915 1727204322.13499: variable 'omit' from source: magic vars 49915 1727204322.13602: variable '_current_interfaces' from source: set_fact 49915 1727204322.13722: variable 'omit' from source: magic vars 49915 1727204322.13743: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204322.13798: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204322.13832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204322.13854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204322.13983: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204322.13987: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204322.13990: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204322.13993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204322.14044: Set connection var ansible_connection to ssh 49915 1727204322.14052: Set connection var ansible_shell_type to sh 49915 1727204322.14064: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204322.14080: Set connection var ansible_shell_executable to /bin/sh 49915 1727204322.14096: Set connection var ansible_timeout to 10 49915 1727204322.14116: Set connection var ansible_pipelining to False 49915 1727204322.14141: variable 'ansible_shell_executable' from source: unknown 49915 1727204322.14149: variable 'ansible_connection' from source: unknown 49915 1727204322.14156: variable 'ansible_module_compression' from source: unknown 49915 1727204322.14163: variable 'ansible_shell_type' from source: unknown 49915 1727204322.14170: variable 'ansible_shell_executable' from source: unknown 49915 1727204322.14179: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204322.14187: variable 'ansible_pipelining' from source: unknown 49915 1727204322.14219: variable 'ansible_timeout' from source: unknown 49915 1727204322.14222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204322.14375: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 49915 1727204322.14420: variable 'omit' from source: magic vars 49915 1727204322.14422: starting attempt loop 49915 1727204322.14425: running the handler 49915 1727204322.14431: handler run complete 49915 1727204322.14447: attempt loop complete, returning result 49915 1727204322.14452: _execute() done 49915 1727204322.14457: dumping result to json 49915 1727204322.14464: done dumping result, returning 49915 1727204322.14472: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [028d2410-947f-dcd7-b5af-000000000ad8] 49915 1727204322.14526: sending task result for task 028d2410-947f-dcd7-b5af-000000000ad8 ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "lsr101", "peerlsr101" ] }, "changed": false } 49915 1727204322.14729: no more pending results, returning what we have 49915 1727204322.14733: results queue empty 49915 1727204322.14734: checking for any_errors_fatal 49915 1727204322.14789: done checking for any_errors_fatal 49915 1727204322.14791: checking for max_fail_percentage 49915 1727204322.14793: done checking for max_fail_percentage 49915 1727204322.14794: checking to see if all hosts have failed and the running result is not ok 49915 1727204322.14795: done checking to see if all hosts have failed 49915 1727204322.14795: getting the remaining hosts for this loop 49915 1727204322.14797: done getting the remaining hosts for this loop 49915 1727204322.14804: getting the next task for host managed-node2 49915 1727204322.14816: done getting next task for host managed-node2 49915 1727204322.14819: ^ task is: TASK: Show current_interfaces 49915 1727204322.14825: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204322.14830: getting variables 49915 1727204322.14832: in VariableManager get_vars() 49915 1727204322.14997: Calling all_inventory to load vars for managed-node2 49915 1727204322.15000: Calling groups_inventory to load vars for managed-node2 49915 1727204322.15003: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204322.15026: done sending task result for task 028d2410-947f-dcd7-b5af-000000000ad8 49915 1727204322.15030: WORKER PROCESS EXITING 49915 1727204322.15041: Calling all_plugins_play to load vars for managed-node2 49915 1727204322.15044: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204322.15048: Calling groups_plugins_play to load vars for managed-node2 49915 1727204322.17305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204322.19017: done with get_vars() 49915 1727204322.19042: done getting variables 49915 1727204322.19111: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:58:42 -0400 (0:00:00.072) 0:00:28.897 ***** 49915 1727204322.19147: entering _queue_task() for managed-node2/debug 49915 1727204322.19696: worker is 1 (out of 1 available) 49915 1727204322.19707: exiting _queue_task() for managed-node2/debug 49915 1727204322.19720: done queuing things up, now waiting for results queue to drain 49915 1727204322.19721: waiting for pending results... 
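'Show current_interfaces' (show_interfaces.yml:5) resolves the current_interfaces fact through the debug action module, and the MSG line printed just below reads "current_interfaces: [...]", which points to a formatted msg rather than a bare var argument. A sketch, with the msg template inferred from that output:

# tasks/show_interfaces.yml, task at line 5 (sketch; msg wording inferred from the printed MSG)
- name: Show current_interfaces
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"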
49915 1727204322.19878: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 49915 1727204322.20006: in run() - task 028d2410-947f-dcd7-b5af-000000000aa1 49915 1727204322.20030: variable 'ansible_search_path' from source: unknown 49915 1727204322.20037: variable 'ansible_search_path' from source: unknown 49915 1727204322.20089: calling self._execute() 49915 1727204322.20197: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204322.20279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204322.20286: variable 'omit' from source: magic vars 49915 1727204322.20656: variable 'ansible_distribution_major_version' from source: facts 49915 1727204322.20674: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204322.20690: variable 'omit' from source: magic vars 49915 1727204322.20771: variable 'omit' from source: magic vars 49915 1727204322.20933: variable 'current_interfaces' from source: set_fact 49915 1727204322.20969: variable 'omit' from source: magic vars 49915 1727204322.21020: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204322.21149: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204322.21153: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204322.21156: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204322.21158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204322.21171: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204322.21485: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204322.21489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204322.21591: Set connection var ansible_connection to ssh 49915 1727204322.21594: Set connection var ansible_shell_type to sh 49915 1727204322.21596: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204322.21599: Set connection var ansible_shell_executable to /bin/sh 49915 1727204322.21601: Set connection var ansible_timeout to 10 49915 1727204322.21603: Set connection var ansible_pipelining to False 49915 1727204322.21605: variable 'ansible_shell_executable' from source: unknown 49915 1727204322.21607: variable 'ansible_connection' from source: unknown 49915 1727204322.21609: variable 'ansible_module_compression' from source: unknown 49915 1727204322.21611: variable 'ansible_shell_type' from source: unknown 49915 1727204322.21616: variable 'ansible_shell_executable' from source: unknown 49915 1727204322.21618: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204322.21620: variable 'ansible_pipelining' from source: unknown 49915 1727204322.21622: variable 'ansible_timeout' from source: unknown 49915 1727204322.21624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204322.21951: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
49915 1727204322.21972: variable 'omit' from source: magic vars 49915 1727204322.22068: starting attempt loop 49915 1727204322.22071: running the handler 49915 1727204322.22155: handler run complete 49915 1727204322.22284: attempt loop complete, returning result 49915 1727204322.22287: _execute() done 49915 1727204322.22290: dumping result to json 49915 1727204322.22292: done dumping result, returning 49915 1727204322.22295: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [028d2410-947f-dcd7-b5af-000000000aa1] 49915 1727204322.22297: sending task result for task 028d2410-947f-dcd7-b5af-000000000aa1 49915 1727204322.22530: done sending task result for task 028d2410-947f-dcd7-b5af-000000000aa1 49915 1727204322.22533: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'lsr101', 'peerlsr101'] 49915 1727204322.22587: no more pending results, returning what we have 49915 1727204322.22590: results queue empty 49915 1727204322.22592: checking for any_errors_fatal 49915 1727204322.22596: done checking for any_errors_fatal 49915 1727204322.22597: checking for max_fail_percentage 49915 1727204322.22599: done checking for max_fail_percentage 49915 1727204322.22600: checking to see if all hosts have failed and the running result is not ok 49915 1727204322.22601: done checking to see if all hosts have failed 49915 1727204322.22678: getting the remaining hosts for this loop 49915 1727204322.22681: done getting the remaining hosts for this loop 49915 1727204322.22686: getting the next task for host managed-node2 49915 1727204322.22696: done getting next task for host managed-node2 49915 1727204322.22699: ^ task is: TASK: Install iproute 49915 1727204322.22702: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204322.22707: getting variables 49915 1727204322.22709: in VariableManager get_vars() 49915 1727204322.22762: Calling all_inventory to load vars for managed-node2 49915 1727204322.22766: Calling groups_inventory to load vars for managed-node2 49915 1727204322.22768: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204322.22831: Calling all_plugins_play to load vars for managed-node2 49915 1727204322.22836: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204322.22839: Calling groups_plugins_play to load vars for managed-node2 49915 1727204322.24672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204322.26556: done with get_vars() 49915 1727204322.26585: done getting variables 49915 1727204322.26655: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 14:58:42 -0400 (0:00:00.075) 0:00:28.973 ***** 49915 1727204322.26694: entering _queue_task() for managed-node2/package 49915 1727204322.27132: worker is 1 (out of 1 available) 49915 1727204322.27145: exiting _queue_task() for managed-node2/package 49915 1727204322.27157: done queuing things up, now waiting for results queue to drain 49915 1727204322.27158: waiting for pending results... 
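The "Install iproute" task (manage_test_interface.yml:16) has just been queued. A minimal sketch of such a task, assuming only the arguments that show up later in the module invocation (name=iproute, state=present); the trace also reports "attempts": 1 and an "__install_status is success" check, which suggests a register/until retry loop whose exact form is not visible here:

    # Sketch under the assumptions above, not a quote from the playbook.
    - name: Install iproute
      package:
        name: iproute
        state: present
      # the retry handling (register/until/retries) is inferred and omitted here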
49915 1727204322.27463: running TaskExecutor() for managed-node2/TASK: Install iproute 49915 1727204322.27683: in run() - task 028d2410-947f-dcd7-b5af-00000000093f 49915 1727204322.27688: variable 'ansible_search_path' from source: unknown 49915 1727204322.27691: variable 'ansible_search_path' from source: unknown 49915 1727204322.27694: calling self._execute() 49915 1727204322.27760: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204322.27772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204322.27788: variable 'omit' from source: magic vars 49915 1727204322.28202: variable 'ansible_distribution_major_version' from source: facts 49915 1727204322.28224: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204322.28234: variable 'omit' from source: magic vars 49915 1727204322.28281: variable 'omit' from source: magic vars 49915 1727204322.28494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 49915 1727204322.30747: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 49915 1727204322.30834: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 49915 1727204322.30980: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 49915 1727204322.30983: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 49915 1727204322.30986: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 49915 1727204322.31060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 49915 1727204322.31101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 49915 1727204322.31136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 49915 1727204322.31186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 49915 1727204322.31216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 49915 1727204322.31336: variable '__network_is_ostree' from source: set_fact 49915 1727204322.31346: variable 'omit' from source: magic vars 49915 1727204322.31383: variable 'omit' from source: magic vars 49915 1727204322.31537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204322.31540: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204322.31543: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204322.31545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 49915 1727204322.31547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204322.31555: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204322.31563: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204322.31570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204322.31687: Set connection var ansible_connection to ssh 49915 1727204322.31695: Set connection var ansible_shell_type to sh 49915 1727204322.31706: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204322.31723: Set connection var ansible_shell_executable to /bin/sh 49915 1727204322.31732: Set connection var ansible_timeout to 10 49915 1727204322.31742: Set connection var ansible_pipelining to False 49915 1727204322.31780: variable 'ansible_shell_executable' from source: unknown 49915 1727204322.31788: variable 'ansible_connection' from source: unknown 49915 1727204322.31794: variable 'ansible_module_compression' from source: unknown 49915 1727204322.31801: variable 'ansible_shell_type' from source: unknown 49915 1727204322.31807: variable 'ansible_shell_executable' from source: unknown 49915 1727204322.31817: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204322.31864: variable 'ansible_pipelining' from source: unknown 49915 1727204322.31867: variable 'ansible_timeout' from source: unknown 49915 1727204322.31869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204322.31950: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204322.31973: variable 'omit' from source: magic vars 49915 1727204322.31988: starting attempt loop 49915 1727204322.31994: running the handler 49915 1727204322.32006: variable 'ansible_facts' from source: unknown 49915 1727204322.32083: variable 'ansible_facts' from source: unknown 49915 1727204322.32088: _low_level_execute_command(): starting 49915 1727204322.32091: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204322.32888: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204322.32978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204322.33015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204322.33033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204322.33054: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204322.33149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204322.34848: stdout chunk (state=3): >>>/root <<< 49915 1727204322.34995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204322.35062: stderr chunk (state=3): >>><<< 49915 1727204322.35066: stdout chunk (state=3): >>><<< 49915 1727204322.35089: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204322.35210: _low_level_execute_command(): starting 49915 1727204322.35218: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204322.3510625-51867-171394587075932 `" && echo ansible-tmp-1727204322.3510625-51867-171394587075932="` echo /root/.ansible/tmp/ansible-tmp-1727204322.3510625-51867-171394587075932 `" ) && sleep 0' 49915 1727204322.35823: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204322.35836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204322.35851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204322.35868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204322.35887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204322.35898: stderr chunk (state=3): >>>debug2: match not found <<< 49915 1727204322.35941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204322.36015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204322.36036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204322.36061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204322.36170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204322.38052: stdout chunk (state=3): >>>ansible-tmp-1727204322.3510625-51867-171394587075932=/root/.ansible/tmp/ansible-tmp-1727204322.3510625-51867-171394587075932 <<< 49915 1727204322.38229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204322.38232: stdout chunk (state=3): >>><<< 49915 1727204322.38235: stderr chunk (state=3): >>><<< 49915 1727204322.38253: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204322.3510625-51867-171394587075932=/root/.ansible/tmp/ansible-tmp-1727204322.3510625-51867-171394587075932 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204322.38294: variable 'ansible_module_compression' from source: unknown 49915 1727204322.38388: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 49915 1727204322.38446: variable 'ansible_facts' from source: unknown 49915 1727204322.38669: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204322.3510625-51867-171394587075932/AnsiballZ_dnf.py 49915 1727204322.38810: Sending initial data 49915 1727204322.38816: Sent initial data (152 bytes) 49915 1727204322.39280: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204322.39294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 49915 1727204322.39305: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204322.39348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204322.39360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204322.39439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204322.40983: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 49915 1727204322.41011: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204322.41079: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 49915 1727204322.41180: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp036jjz36 /root/.ansible/tmp/ansible-tmp-1727204322.3510625-51867-171394587075932/AnsiballZ_dnf.py <<< 49915 1727204322.41184: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204322.3510625-51867-171394587075932/AnsiballZ_dnf.py" <<< 49915 1727204322.41244: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp036jjz36" to remote "/root/.ansible/tmp/ansible-tmp-1727204322.3510625-51867-171394587075932/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204322.3510625-51867-171394587075932/AnsiballZ_dnf.py" <<< 49915 1727204322.42248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204322.42251: stderr chunk (state=3): >>><<< 49915 1727204322.42254: stdout chunk (state=3): >>><<< 49915 1727204322.42272: done transferring module to remote 49915 1727204322.42284: _low_level_execute_command(): starting 49915 1727204322.42289: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204322.3510625-51867-171394587075932/ /root/.ansible/tmp/ansible-tmp-1727204322.3510625-51867-171394587075932/AnsiballZ_dnf.py && sleep 0' 49915 1727204322.42727: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204322.42730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204322.42733: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204322.42736: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204322.42737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204322.42787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204322.42791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204322.42864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204322.44781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204322.44785: stdout chunk (state=3): >>><<< 49915 1727204322.44787: stderr chunk (state=3): >>><<< 49915 1727204322.44790: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204322.44792: _low_level_execute_command(): starting 49915 1727204322.44795: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204322.3510625-51867-171394587075932/AnsiballZ_dnf.py && sleep 0' 49915 1727204322.45502: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204322.45507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204322.45527: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204322.45532: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204322.45590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204322.45626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204322.45653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204322.45732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204322.87120: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 49915 1727204322.91353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 49915 1727204322.91357: stdout chunk (state=3): >>><<< 49915 1727204322.91359: stderr chunk (state=3): >>><<< 49915 1727204322.91399: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
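In the module result above, only "name": ["iproute"] and "state": "present" come from the play; every other key is a default filled in by the dnf module. The generic package action resolved to the dnf backend on this host, as the _execute_module call for ansible.legacy.dnf in the next entry confirms. A roughly equivalent explicit form (a sketch, not the playbook's own wording):

    # Explicit-backend equivalent of the generic package task, for comparison.
    - name: Install iproute
      ansible.builtin.dnf:
        name: iproute
        state: present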
49915 1727204322.91433: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204322.3510625-51867-171394587075932/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204322.91464: _low_level_execute_command(): starting 49915 1727204322.91467: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204322.3510625-51867-171394587075932/ > /dev/null 2>&1 && sleep 0' 49915 1727204322.92968: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204322.93194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204322.93298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204322.95174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204322.95270: stdout chunk (state=3): >>><<< 49915 1727204322.95273: stderr chunk (state=3): >>><<< 49915 1727204322.95277: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204322.95280: handler run complete 49915 1727204322.95374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 49915 1727204322.95892: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 49915 1727204322.95934: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 49915 1727204322.95966: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 49915 1727204322.95999: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 49915 1727204322.96078: variable '__install_status' from source: set_fact 49915 1727204322.96298: Evaluated conditional (__install_status is success): True 49915 1727204322.96313: attempt loop complete, returning result 49915 1727204322.96318: _execute() done 49915 1727204322.96321: dumping result to json 49915 1727204322.96327: done dumping result, returning 49915 1727204322.96335: done running TaskExecutor() for managed-node2/TASK: Install iproute [028d2410-947f-dcd7-b5af-00000000093f] 49915 1727204322.96359: sending task result for task 028d2410-947f-dcd7-b5af-00000000093f 49915 1727204322.96440: done sending task result for task 028d2410-947f-dcd7-b5af-00000000093f 49915 1727204322.96443: WORKER PROCESS EXITING ok: [managed-node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 49915 1727204322.96552: no more pending results, returning what we have 49915 1727204322.96556: results queue empty 49915 1727204322.96557: checking for any_errors_fatal 49915 1727204322.96564: done checking for any_errors_fatal 49915 1727204322.96565: checking for max_fail_percentage 49915 1727204322.96567: done checking for max_fail_percentage 49915 1727204322.96568: checking to see if all hosts have failed and the running result is not ok 49915 1727204322.96569: done checking to see if all hosts have failed 49915 1727204322.96569: getting the remaining hosts for this loop 49915 1727204322.96571: done getting the remaining hosts for this loop 49915 1727204322.96578: getting the next task for host managed-node2 49915 1727204322.96585: done getting next task for host managed-node2 49915 1727204322.96588: ^ task is: TASK: Create veth interface {{ interface }} 49915 1727204322.96590: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204322.96594: getting variables 49915 1727204322.96597: in VariableManager get_vars() 49915 1727204322.96642: Calling all_inventory to load vars for managed-node2 49915 1727204322.96645: Calling groups_inventory to load vars for managed-node2 49915 1727204322.96647: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204322.96659: Calling all_plugins_play to load vars for managed-node2 49915 1727204322.96662: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204322.96665: Calling groups_plugins_play to load vars for managed-node2 49915 1727204322.99426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204323.03380: done with get_vars() 49915 1727204323.03411: done getting variables 49915 1727204323.03470: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204323.03596: variable 'interface' from source: play vars TASK [Create veth interface lsr101] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 14:58:43 -0400 (0:00:00.769) 0:00:29.742 ***** 49915 1727204323.03627: entering _queue_task() for managed-node2/command 49915 1727204323.04108: worker is 1 (out of 1 available) 49915 1727204323.04117: exiting _queue_task() for managed-node2/command 49915 1727204323.04128: done queuing things up, now waiting for results queue to drain 49915 1727204323.04129: waiting for pending results... 
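The veth-creation task queued here is skipped below for every loop item, because lsr101 already appears in current_interfaces. A sketch reconstructed from the skipped item strings and the false_condition reported below; the templated form and the with_items loop (the items lookup plugin is loaded for this task) are inferred:

    # Sketch; literal values seen in the trace are
    # "ip link add lsr101 type veth peer name peerlsr101", etc.
    - name: Create veth interface {{ interface }}
      command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}
        - ip link set peer{{ interface }} up
        - ip link set {{ interface }} up
      when: type == 'veth' and state == 'present' and interface not in current_interfaces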
49915 1727204323.04366: running TaskExecutor() for managed-node2/TASK: Create veth interface lsr101 49915 1727204323.04400: in run() - task 028d2410-947f-dcd7-b5af-000000000940 49915 1727204323.04464: variable 'ansible_search_path' from source: unknown 49915 1727204323.04468: variable 'ansible_search_path' from source: unknown 49915 1727204323.04702: variable 'interface' from source: play vars 49915 1727204323.04793: variable 'interface' from source: play vars 49915 1727204323.04867: variable 'interface' from source: play vars 49915 1727204323.05046: Loaded config def from plugin (lookup/items) 49915 1727204323.05114: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 49915 1727204323.05120: variable 'omit' from source: magic vars 49915 1727204323.05273: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204323.05291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204323.05306: variable 'omit' from source: magic vars 49915 1727204323.05784: variable 'ansible_distribution_major_version' from source: facts 49915 1727204323.05788: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204323.06245: variable 'type' from source: play vars 49915 1727204323.06337: variable 'state' from source: include params 49915 1727204323.06354: variable 'interface' from source: play vars 49915 1727204323.06363: variable 'current_interfaces' from source: set_fact 49915 1727204323.06374: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 49915 1727204323.06387: when evaluation is False, skipping this task 49915 1727204323.06421: variable 'item' from source: unknown 49915 1727204323.06708: variable 'item' from source: unknown skipping: [managed-node2] => (item=ip link add lsr101 type veth peer name peerlsr101) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add lsr101 type veth peer name peerlsr101", "skip_reason": "Conditional result was False" } 49915 1727204323.07018: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204323.07028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204323.07149: variable 'omit' from source: magic vars 49915 1727204323.07275: variable 'ansible_distribution_major_version' from source: facts 49915 1727204323.07288: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204323.07473: variable 'type' from source: play vars 49915 1727204323.07484: variable 'state' from source: include params 49915 1727204323.07494: variable 'interface' from source: play vars 49915 1727204323.07502: variable 'current_interfaces' from source: set_fact 49915 1727204323.07516: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 49915 1727204323.07524: when evaluation is False, skipping this task 49915 1727204323.07557: variable 'item' from source: unknown 49915 1727204323.07623: variable 'item' from source: unknown skipping: [managed-node2] => (item=ip link set peerlsr101 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerlsr101 up", "skip_reason": "Conditional result was False" } 49915 1727204323.07880: variable 
'ansible_host' from source: host vars for 'managed-node2' 49915 1727204323.07885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204323.07888: variable 'omit' from source: magic vars 49915 1727204323.07925: variable 'ansible_distribution_major_version' from source: facts 49915 1727204323.07935: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204323.08132: variable 'type' from source: play vars 49915 1727204323.08142: variable 'state' from source: include params 49915 1727204323.08153: variable 'interface' from source: play vars 49915 1727204323.08222: variable 'current_interfaces' from source: set_fact 49915 1727204323.08225: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 49915 1727204323.08228: when evaluation is False, skipping this task 49915 1727204323.08230: variable 'item' from source: unknown 49915 1727204323.08280: variable 'item' from source: unknown skipping: [managed-node2] => (item=ip link set lsr101 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set lsr101 up", "skip_reason": "Conditional result was False" } 49915 1727204323.08582: dumping result to json 49915 1727204323.08585: done dumping result, returning 49915 1727204323.08587: done running TaskExecutor() for managed-node2/TASK: Create veth interface lsr101 [028d2410-947f-dcd7-b5af-000000000940] 49915 1727204323.08589: sending task result for task 028d2410-947f-dcd7-b5af-000000000940 49915 1727204323.08631: done sending task result for task 028d2410-947f-dcd7-b5af-000000000940 49915 1727204323.08634: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false } MSG: All items skipped 49915 1727204323.08672: no more pending results, returning what we have 49915 1727204323.08678: results queue empty 49915 1727204323.08679: checking for any_errors_fatal 49915 1727204323.08688: done checking for any_errors_fatal 49915 1727204323.08689: checking for max_fail_percentage 49915 1727204323.08690: done checking for max_fail_percentage 49915 1727204323.08691: checking to see if all hosts have failed and the running result is not ok 49915 1727204323.08692: done checking to see if all hosts have failed 49915 1727204323.08693: getting the remaining hosts for this loop 49915 1727204323.08695: done getting the remaining hosts for this loop 49915 1727204323.08698: getting the next task for host managed-node2 49915 1727204323.08705: done getting next task for host managed-node2 49915 1727204323.08708: ^ task is: TASK: Set up veth as managed by NetworkManager 49915 1727204323.08711: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204323.08718: getting variables 49915 1727204323.08720: in VariableManager get_vars() 49915 1727204323.08761: Calling all_inventory to load vars for managed-node2 49915 1727204323.08764: Calling groups_inventory to load vars for managed-node2 49915 1727204323.08766: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204323.08779: Calling all_plugins_play to load vars for managed-node2 49915 1727204323.08782: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204323.08785: Calling groups_plugins_play to load vars for managed-node2 49915 1727204323.10616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204323.12230: done with get_vars() 49915 1727204323.12250: done getting variables 49915 1727204323.12306: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 14:58:43 -0400 (0:00:00.087) 0:00:29.829 ***** 49915 1727204323.12338: entering _queue_task() for managed-node2/command 49915 1727204323.12677: worker is 1 (out of 1 available) 49915 1727204323.12691: exiting _queue_task() for managed-node2/command 49915 1727204323.12706: done queuing things up, now waiting for results queue to drain 49915 1727204323.12707: waiting for pending results... 
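The next task is likewise skipped below, since state is not 'present' in this run. Only its name, the command action plugin, and the when-condition are visible in the trace, so the nmcli invocation in this sketch is a hypothetical placeholder:

    - name: Set up veth as managed by NetworkManager
      # placeholder command, not taken from the trace
      command: nmcli d set {{ interface }} managed true
      when: type == 'veth' and state == 'present'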
49915 1727204323.13073: running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager 49915 1727204323.13216: in run() - task 028d2410-947f-dcd7-b5af-000000000941 49915 1727204323.13242: variable 'ansible_search_path' from source: unknown 49915 1727204323.13251: variable 'ansible_search_path' from source: unknown 49915 1727204323.13296: calling self._execute() 49915 1727204323.13402: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204323.13457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204323.13461: variable 'omit' from source: magic vars 49915 1727204323.13994: variable 'ansible_distribution_major_version' from source: facts 49915 1727204323.13998: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204323.14206: variable 'type' from source: play vars 49915 1727204323.14224: variable 'state' from source: include params 49915 1727204323.14236: Evaluated conditional (type == 'veth' and state == 'present'): False 49915 1727204323.14281: when evaluation is False, skipping this task 49915 1727204323.14284: _execute() done 49915 1727204323.14286: dumping result to json 49915 1727204323.14288: done dumping result, returning 49915 1727204323.14290: done running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager [028d2410-947f-dcd7-b5af-000000000941] 49915 1727204323.14292: sending task result for task 028d2410-947f-dcd7-b5af-000000000941 49915 1727204323.14481: done sending task result for task 028d2410-947f-dcd7-b5af-000000000941 49915 1727204323.14485: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 49915 1727204323.14534: no more pending results, returning what we have 49915 1727204323.14537: results queue empty 49915 1727204323.14538: checking for any_errors_fatal 49915 1727204323.14551: done checking for any_errors_fatal 49915 1727204323.14552: checking for max_fail_percentage 49915 1727204323.14554: done checking for max_fail_percentage 49915 1727204323.14555: checking to see if all hosts have failed and the running result is not ok 49915 1727204323.14559: done checking to see if all hosts have failed 49915 1727204323.14559: getting the remaining hosts for this loop 49915 1727204323.14561: done getting the remaining hosts for this loop 49915 1727204323.14566: getting the next task for host managed-node2 49915 1727204323.14574: done getting next task for host managed-node2 49915 1727204323.14579: ^ task is: TASK: Delete veth interface {{ interface }} 49915 1727204323.14582: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204323.14587: getting variables 49915 1727204323.14589: in VariableManager get_vars() 49915 1727204323.14643: Calling all_inventory to load vars for managed-node2 49915 1727204323.14646: Calling groups_inventory to load vars for managed-node2 49915 1727204323.14649: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204323.14662: Calling all_plugins_play to load vars for managed-node2 49915 1727204323.14665: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204323.14669: Calling groups_plugins_play to load vars for managed-node2 49915 1727204323.16981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204323.19325: done with get_vars() 49915 1727204323.19364: done getting variables 49915 1727204323.19433: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204323.19580: variable 'interface' from source: play vars TASK [Delete veth interface lsr101] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 14:58:43 -0400 (0:00:00.073) 0:00:29.903 ***** 49915 1727204323.19687: entering _queue_task() for managed-node2/command 49915 1727204323.20060: worker is 1 (out of 1 available) 49915 1727204323.20073: exiting _queue_task() for managed-node2/command 49915 1727204323.20286: done queuing things up, now waiting for results queue to drain 49915 1727204323.20288: waiting for pending results... 
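This is the first manage_test_interface.yml task in this block whose condition evaluates True (type is veth, state is absent, and lsr101 is in current_interfaces), so below the command module is actually packaged, copied over SSH, and executed on managed-node2. A sketch with the condition quoted from the trace; the exact ip link arguments are an assumption, since the command string itself is not shown in this part of the log:

    - name: Delete veth interface {{ interface }}
      # assumed deletion command; only the task name and condition are visible here
      command: ip link del {{ interface }} type {{ type }}
      when: type == 'veth' and state == 'absent' and interface in current_interfaces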
49915 1727204323.20421: running TaskExecutor() for managed-node2/TASK: Delete veth interface lsr101 49915 1727204323.20503: in run() - task 028d2410-947f-dcd7-b5af-000000000942 49915 1727204323.20581: variable 'ansible_search_path' from source: unknown 49915 1727204323.20584: variable 'ansible_search_path' from source: unknown 49915 1727204323.20588: calling self._execute() 49915 1727204323.20678: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204323.20690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204323.20703: variable 'omit' from source: magic vars 49915 1727204323.21206: variable 'ansible_distribution_major_version' from source: facts 49915 1727204323.21236: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204323.21451: variable 'type' from source: play vars 49915 1727204323.21461: variable 'state' from source: include params 49915 1727204323.21493: variable 'interface' from source: play vars 49915 1727204323.21496: variable 'current_interfaces' from source: set_fact 49915 1727204323.21499: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 49915 1727204323.21509: variable 'omit' from source: magic vars 49915 1727204323.21602: variable 'omit' from source: magic vars 49915 1727204323.21660: variable 'interface' from source: play vars 49915 1727204323.21684: variable 'omit' from source: magic vars 49915 1727204323.21737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204323.21791: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204323.21823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204323.21845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204323.21860: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204323.21897: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204323.21929: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204323.21932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204323.22019: Set connection var ansible_connection to ssh 49915 1727204323.22028: Set connection var ansible_shell_type to sh 49915 1727204323.22083: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204323.22086: Set connection var ansible_shell_executable to /bin/sh 49915 1727204323.22088: Set connection var ansible_timeout to 10 49915 1727204323.22090: Set connection var ansible_pipelining to False 49915 1727204323.22102: variable 'ansible_shell_executable' from source: unknown 49915 1727204323.22108: variable 'ansible_connection' from source: unknown 49915 1727204323.22117: variable 'ansible_module_compression' from source: unknown 49915 1727204323.22123: variable 'ansible_shell_type' from source: unknown 49915 1727204323.22130: variable 'ansible_shell_executable' from source: unknown 49915 1727204323.22136: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204323.22147: variable 'ansible_pipelining' from source: unknown 49915 1727204323.22153: variable 'ansible_timeout' from source: unknown 49915 1727204323.22160: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204323.22311: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204323.22362: variable 'omit' from source: magic vars 49915 1727204323.22365: starting attempt loop 49915 1727204323.22367: running the handler 49915 1727204323.22369: _low_level_execute_command(): starting 49915 1727204323.22372: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204323.23193: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204323.23199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204323.23259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204323.23291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204323.23331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204323.23410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204323.25128: stdout chunk (state=3): >>>/root <<< 49915 1727204323.25226: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204323.25281: stderr chunk (state=3): >>><<< 49915 1727204323.25286: stdout chunk (state=3): >>><<< 49915 1727204323.25294: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204323.25309: _low_level_execute_command(): starting 49915 1727204323.25430: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204323.2529337-51907-253065559691187 `" && echo ansible-tmp-1727204323.2529337-51907-253065559691187="` echo /root/.ansible/tmp/ansible-tmp-1727204323.2529337-51907-253065559691187 `" ) && sleep 0' 49915 1727204323.25925: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204323.25935: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204323.25947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204323.25960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204323.25973: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204323.25994: stderr chunk (state=3): >>>debug2: match not found <<< 49915 1727204323.25997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204323.26008: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49915 1727204323.26020: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address <<< 49915 1727204323.26104: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204323.26124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204323.26223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204323.28218: stdout chunk (state=3): >>>ansible-tmp-1727204323.2529337-51907-253065559691187=/root/.ansible/tmp/ansible-tmp-1727204323.2529337-51907-253065559691187 <<< 49915 1727204323.28363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204323.28377: stderr chunk (state=3): >>><<< 49915 1727204323.28387: stdout chunk (state=3): >>><<< 49915 1727204323.28413: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204323.2529337-51907-253065559691187=/root/.ansible/tmp/ansible-tmp-1727204323.2529337-51907-253065559691187 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204323.28522: variable 'ansible_module_compression' from source: unknown 49915 1727204323.28526: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49915 1727204323.28565: variable 'ansible_facts' from source: unknown 49915 1727204323.28980: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204323.2529337-51907-253065559691187/AnsiballZ_command.py 49915 1727204323.29096: Sending initial data 49915 1727204323.29099: Sent initial data (156 bytes) 49915 1727204323.29791: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204323.29799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204323.29812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204323.29832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204323.29926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204323.31580: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 49915 1727204323.31598: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 49915 1727204323.31627: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204323.31727: stderr chunk (state=3): >>>debug2: 
Sending SSH2_FXP_REALPATH "." <<< 49915 1727204323.31789: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmphdjd0kdd /root/.ansible/tmp/ansible-tmp-1727204323.2529337-51907-253065559691187/AnsiballZ_command.py <<< 49915 1727204323.31812: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204323.2529337-51907-253065559691187/AnsiballZ_command.py" <<< 49915 1727204323.31892: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmphdjd0kdd" to remote "/root/.ansible/tmp/ansible-tmp-1727204323.2529337-51907-253065559691187/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204323.2529337-51907-253065559691187/AnsiballZ_command.py" <<< 49915 1727204323.32841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204323.32844: stderr chunk (state=3): >>><<< 49915 1727204323.32859: stdout chunk (state=3): >>><<< 49915 1727204323.33012: done transferring module to remote 49915 1727204323.33016: _low_level_execute_command(): starting 49915 1727204323.33018: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204323.2529337-51907-253065559691187/ /root/.ansible/tmp/ansible-tmp-1727204323.2529337-51907-253065559691187/AnsiballZ_command.py && sleep 0' 49915 1727204323.33574: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204323.33683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204323.33687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204323.33702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204323.33739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204323.33754: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204323.33769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204323.33862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204323.35731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204323.35735: stderr chunk (state=3): >>><<< 49915 1727204323.35737: stdout chunk (state=3): >>><<< 49915 1727204323.35753: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204323.35761: _low_level_execute_command(): starting 49915 1727204323.35771: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204323.2529337-51907-253065559691187/AnsiballZ_command.py && sleep 0' 49915 1727204323.36399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204323.36414: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204323.36429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204323.36448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204323.36552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204323.36612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204323.36698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204323.52729: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr101", "type", "veth"], "start": "2024-09-24 14:58:43.515093", "end": "2024-09-24 14:58:43.521295", "delta": "0:00:00.006202", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr101 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49915 1727204323.54108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204323.54128: stderr chunk (state=3): >>>Shared connection to 10.31.13.254 closed. 
<<< 49915 1727204323.54183: stderr chunk (state=3): >>><<< 49915 1727204323.54454: stdout chunk (state=3): >>><<< 49915 1727204323.54458: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr101", "type", "veth"], "start": "2024-09-24 14:58:43.515093", "end": "2024-09-24 14:58:43.521295", "delta": "0:00:00.006202", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr101 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
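The chunk above is the command module finishing its run of "ip link del lsr101 type veth" on managed-node2: rc=0, empty stdout/stderr, and a delta of roughly 6 ms. A minimal standalone sketch of what the transferred AnsiballZ_command payload effectively does with the module_args shown in that JSON; this is illustrative Python only, not Ansible's implementation, and the return keys are limited to the ones visible in the log:

import datetime
import shlex
import subprocess

def run_command(raw_params: str) -> dict:
    # _uses_shell is false in the module_args above, so the raw params are split into argv
    argv = shlex.split(raw_params)
    start = datetime.datetime.now()
    proc = subprocess.run(argv, capture_output=True, text=True)  # needs root and an existing lsr101
    end = datetime.datetime.now()
    return {
        "changed": True,                     # the command module reports changed by default
        "cmd": argv,
        "rc": proc.returncode,
        "stdout": proc.stdout.rstrip("\n"),  # strip_empty_ends is true in the module_args
        "stderr": proc.stderr.rstrip("\n"),
        "start": str(start),
        "end": str(end),
        "delta": str(end - start),
    }

if __name__ == "__main__":
    print(run_command("ip link del lsr101 type veth"))

The module itself returned "changed": true, while the task result printed further below reports "changed": false; the "Evaluated conditional (False): False" line right after "handler run complete" suggests the test task most likely sets changed_when: false.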
49915 1727204323.54462: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr101 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204323.2529337-51907-253065559691187/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204323.54465: _low_level_execute_command(): starting 49915 1727204323.54467: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204323.2529337-51907-253065559691187/ > /dev/null 2>&1 && sleep 0' 49915 1727204323.55874: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204323.55937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204323.55969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204323.56310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204323.56512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204323.58733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204323.58736: stdout chunk (state=3): >>><<< 49915 1727204323.58744: stderr chunk (state=3): >>><<< 49915 1727204323.58760: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204323.58766: handler run complete 49915 1727204323.58792: Evaluated conditional (False): False 49915 1727204323.58803: attempt loop complete, returning result 49915 1727204323.58805: _execute() done 49915 1727204323.58808: dumping result to json 49915 1727204323.58814: done dumping result, returning 49915 1727204323.58825: done running TaskExecutor() for managed-node2/TASK: Delete veth interface lsr101 [028d2410-947f-dcd7-b5af-000000000942] 49915 1727204323.58830: sending task result for task 028d2410-947f-dcd7-b5af-000000000942 ok: [managed-node2] => { "changed": false, "cmd": [ "ip", "link", "del", "lsr101", "type", "veth" ], "delta": "0:00:00.006202", "end": "2024-09-24 14:58:43.521295", "rc": 0, "start": "2024-09-24 14:58:43.515093" } 49915 1727204323.59085: no more pending results, returning what we have 49915 1727204323.59088: results queue empty 49915 1727204323.59090: checking for any_errors_fatal 49915 1727204323.59094: done checking for any_errors_fatal 49915 1727204323.59095: checking for max_fail_percentage 49915 1727204323.59096: done checking for max_fail_percentage 49915 1727204323.59097: checking to see if all hosts have failed and the running result is not ok 49915 1727204323.59098: done checking to see if all hosts have failed 49915 1727204323.59099: getting the remaining hosts for this loop 49915 1727204323.59100: done getting the remaining hosts for this loop 49915 1727204323.59104: getting the next task for host managed-node2 49915 1727204323.59112: done getting next task for host managed-node2 49915 1727204323.59115: ^ task is: TASK: Create dummy interface {{ interface }} 49915 1727204323.59117: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204323.59122: getting variables 49915 1727204323.59124: in VariableManager get_vars() 49915 1727204323.59165: Calling all_inventory to load vars for managed-node2 49915 1727204323.59167: Calling groups_inventory to load vars for managed-node2 49915 1727204323.59169: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204323.59184: Calling all_plugins_play to load vars for managed-node2 49915 1727204323.59187: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204323.59191: done sending task result for task 028d2410-947f-dcd7-b5af-000000000942 49915 1727204323.59194: WORKER PROCESS EXITING 49915 1727204323.59197: Calling groups_plugins_play to load vars for managed-node2 49915 1727204323.61694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204323.64037: done with get_vars() 49915 1727204323.64066: done getting variables 49915 1727204323.64139: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204323.64267: variable 'interface' from source: play vars TASK [Create dummy interface lsr101] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 14:58:43 -0400 (0:00:00.446) 0:00:30.349 ***** 49915 1727204323.64306: entering _queue_task() for managed-node2/command 49915 1727204323.64879: worker is 1 (out of 1 available) 49915 1727204323.64890: exiting _queue_task() for managed-node2/command 49915 1727204323.64900: done queuing things up, now waiting for results queue to drain 49915 1727204323.64902: waiting for pending results... 
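The four tasks queued next from manage_test_interface.yml (create dummy, delete dummy, create tap, delete tap) are each gated by a when: expression on type, state and current_interfaces, and every gate evaluates to False below, so all four are skipped. The gates written out as plain Python predicates, copied from the "Evaluated conditional (...)" lines; this is only a reading aid, not Ansible's conditional evaluator, and the sample values assume type is 'veth' and state is 'absent' in this include, consistent with the veth deletion that just ran:

def create_dummy(type_, state, interface, current_interfaces):
    return type_ == "dummy" and state == "present" and interface not in current_interfaces

def delete_dummy(type_, state, interface, current_interfaces):
    return type_ == "dummy" and state == "absent" and interface in current_interfaces

def create_tap(type_, state, interface, current_interfaces):
    return type_ == "tap" and state == "present" and interface not in current_interfaces

def delete_tap(type_, state, interface, current_interfaces):
    return type_ == "tap" and state == "absent" and interface in current_interfaces

# All four come out False for a veth teardown, matching the "skipping" results that follow.
for gate in (create_dummy, delete_dummy, create_tap, delete_tap):
    print(gate.__name__, gate("veth", "absent", "lsr101", []))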
49915 1727204323.65010: running TaskExecutor() for managed-node2/TASK: Create dummy interface lsr101 49915 1727204323.65141: in run() - task 028d2410-947f-dcd7-b5af-000000000943 49915 1727204323.65162: variable 'ansible_search_path' from source: unknown 49915 1727204323.65169: variable 'ansible_search_path' from source: unknown 49915 1727204323.65210: calling self._execute() 49915 1727204323.65309: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204323.65321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204323.65335: variable 'omit' from source: magic vars 49915 1727204323.65728: variable 'ansible_distribution_major_version' from source: facts 49915 1727204323.65744: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204323.65959: variable 'type' from source: play vars 49915 1727204323.65969: variable 'state' from source: include params 49915 1727204323.65981: variable 'interface' from source: play vars 49915 1727204323.65997: variable 'current_interfaces' from source: set_fact 49915 1727204323.66109: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 49915 1727204323.66112: when evaluation is False, skipping this task 49915 1727204323.66114: _execute() done 49915 1727204323.66117: dumping result to json 49915 1727204323.66119: done dumping result, returning 49915 1727204323.66121: done running TaskExecutor() for managed-node2/TASK: Create dummy interface lsr101 [028d2410-947f-dcd7-b5af-000000000943] 49915 1727204323.66123: sending task result for task 028d2410-947f-dcd7-b5af-000000000943 49915 1727204323.66180: done sending task result for task 028d2410-947f-dcd7-b5af-000000000943 49915 1727204323.66183: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 49915 1727204323.66228: no more pending results, returning what we have 49915 1727204323.66232: results queue empty 49915 1727204323.66233: checking for any_errors_fatal 49915 1727204323.66242: done checking for any_errors_fatal 49915 1727204323.66242: checking for max_fail_percentage 49915 1727204323.66244: done checking for max_fail_percentage 49915 1727204323.66245: checking to see if all hosts have failed and the running result is not ok 49915 1727204323.66246: done checking to see if all hosts have failed 49915 1727204323.66247: getting the remaining hosts for this loop 49915 1727204323.66248: done getting the remaining hosts for this loop 49915 1727204323.66252: getting the next task for host managed-node2 49915 1727204323.66259: done getting next task for host managed-node2 49915 1727204323.66262: ^ task is: TASK: Delete dummy interface {{ interface }} 49915 1727204323.66265: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204323.66270: getting variables 49915 1727204323.66272: in VariableManager get_vars() 49915 1727204323.66318: Calling all_inventory to load vars for managed-node2 49915 1727204323.66321: Calling groups_inventory to load vars for managed-node2 49915 1727204323.66323: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204323.66335: Calling all_plugins_play to load vars for managed-node2 49915 1727204323.66338: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204323.66340: Calling groups_plugins_play to load vars for managed-node2 49915 1727204323.68568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204323.70441: done with get_vars() 49915 1727204323.70470: done getting variables 49915 1727204323.70533: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204323.70639: variable 'interface' from source: play vars TASK [Delete dummy interface lsr101] ******************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 14:58:43 -0400 (0:00:00.063) 0:00:30.412 ***** 49915 1727204323.70667: entering _queue_task() for managed-node2/command 49915 1727204323.71234: worker is 1 (out of 1 available) 49915 1727204323.71249: exiting _queue_task() for managed-node2/command 49915 1727204323.71260: done queuing things up, now waiting for results queue to drain 49915 1727204323.71262: waiting for pending results... 
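The dense "^ state is: HOST STATE: ..." dumps, such as the one just above where the next task is "Delete dummy interface {{ interface }}", read more easily as a nested record: an outer play-level cursor at block=2, task=19 whose tasks child state (block=0, task=10) tracks the position inside the included manage_test_interface.yml block. A rough Python rendering of the fields those dumps print; the field names are copied from the log, the class is only a reading aid and not Ansible's internal HostState, and the rescue/always child states (always None in these dumps) are omitted:

from dataclasses import dataclass
from typing import Optional

@dataclass
class HostStateView:
    block: int
    task: int
    rescue: int
    always: int
    handlers: int
    run_state: int
    fail_state: int
    pre_flushing_run_state: Optional[int]
    update_handlers: bool
    pending_setup: bool
    tasks_child_state: Optional["HostStateView"] = None
    did_rescue: bool = False
    did_start_at_task: bool = False

# The dump above: outer cursor at block=2, task=19; its tasks child state is the
# cursor inside the included task file, at block=0, task=10.
outer = HostStateView(
    block=2, task=19, rescue=0, always=0, handlers=0,
    run_state=1, fail_state=0, pre_flushing_run_state=1,
    update_handlers=True, pending_setup=False,
    tasks_child_state=HostStateView(
        block=0, task=10, rescue=0, always=0, handlers=0,
        run_state=1, fail_state=0, pre_flushing_run_state=None,
        update_handlers=True, pending_setup=False,
    ),
)
print(outer)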
49915 1727204323.71790: running TaskExecutor() for managed-node2/TASK: Delete dummy interface lsr101 49915 1727204323.71990: in run() - task 028d2410-947f-dcd7-b5af-000000000944 49915 1727204323.72097: variable 'ansible_search_path' from source: unknown 49915 1727204323.72103: variable 'ansible_search_path' from source: unknown 49915 1727204323.72106: calling self._execute() 49915 1727204323.72313: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204323.72332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204323.72421: variable 'omit' from source: magic vars 49915 1727204323.72946: variable 'ansible_distribution_major_version' from source: facts 49915 1727204323.72964: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204323.73191: variable 'type' from source: play vars 49915 1727204323.73200: variable 'state' from source: include params 49915 1727204323.73212: variable 'interface' from source: play vars 49915 1727204323.73219: variable 'current_interfaces' from source: set_fact 49915 1727204323.73231: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 49915 1727204323.73237: when evaluation is False, skipping this task 49915 1727204323.73242: _execute() done 49915 1727204323.73247: dumping result to json 49915 1727204323.73253: done dumping result, returning 49915 1727204323.73290: done running TaskExecutor() for managed-node2/TASK: Delete dummy interface lsr101 [028d2410-947f-dcd7-b5af-000000000944] 49915 1727204323.73292: sending task result for task 028d2410-947f-dcd7-b5af-000000000944 skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 49915 1727204323.73444: no more pending results, returning what we have 49915 1727204323.73447: results queue empty 49915 1727204323.73448: checking for any_errors_fatal 49915 1727204323.73455: done checking for any_errors_fatal 49915 1727204323.73455: checking for max_fail_percentage 49915 1727204323.73457: done checking for max_fail_percentage 49915 1727204323.73458: checking to see if all hosts have failed and the running result is not ok 49915 1727204323.73459: done checking to see if all hosts have failed 49915 1727204323.73460: getting the remaining hosts for this loop 49915 1727204323.73462: done getting the remaining hosts for this loop 49915 1727204323.73466: getting the next task for host managed-node2 49915 1727204323.73474: done getting next task for host managed-node2 49915 1727204323.73478: ^ task is: TASK: Create tap interface {{ interface }} 49915 1727204323.73482: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204323.73487: getting variables 49915 1727204323.73488: in VariableManager get_vars() 49915 1727204323.73536: Calling all_inventory to load vars for managed-node2 49915 1727204323.73539: Calling groups_inventory to load vars for managed-node2 49915 1727204323.73542: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204323.73556: Calling all_plugins_play to load vars for managed-node2 49915 1727204323.73559: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204323.73562: Calling groups_plugins_play to load vars for managed-node2 49915 1727204323.74089: done sending task result for task 028d2410-947f-dcd7-b5af-000000000944 49915 1727204323.74092: WORKER PROCESS EXITING 49915 1727204323.75261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204323.78165: done with get_vars() 49915 1727204323.78198: done getting variables 49915 1727204323.78263: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204323.78387: variable 'interface' from source: play vars TASK [Create tap interface lsr101] ********************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 14:58:43 -0400 (0:00:00.077) 0:00:30.490 ***** 49915 1727204323.78424: entering _queue_task() for managed-node2/command 49915 1727204323.78784: worker is 1 (out of 1 available) 49915 1727204323.78797: exiting _queue_task() for managed-node2/command 49915 1727204323.78810: done queuing things up, now waiting for results queue to drain 49915 1727204323.78814: waiting for pending results... 
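Each TASK banner in this run ends with two durations, for example "(0:00:00.077) 0:00:30.490": the first is the time spent since the previous banner (i.e. on the previous task), the second is the cumulative playbook time, and the figures above do add up (30.349 + 0.063 = 30.412, then + 0.077 gives the 30.490 printed next, allowing for rounding). A toy reconstruction of that bookkeeping; illustrative only, not the actual timing callback used by this test run:

import datetime

class TaskTimer:
    # Produces the "(<previous task duration>) <cumulative playbook time>" suffix
    # seen on the TASK banners in this log.
    def __init__(self):
        self.play_start = datetime.datetime.now()
        self.last_banner = self.play_start

    def banner_suffix(self) -> str:
        now = datetime.datetime.now()
        previous_task = now - self.last_banner
        cumulative = now - self.play_start
        self.last_banner = now
        return f"({previous_task}) {cumulative}"

timer = TaskTimer()
print(timer.banner_suffix())  # called once per task banner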
49915 1727204323.79162: running TaskExecutor() for managed-node2/TASK: Create tap interface lsr101 49915 1727204323.79235: in run() - task 028d2410-947f-dcd7-b5af-000000000945 49915 1727204323.79261: variable 'ansible_search_path' from source: unknown 49915 1727204323.79269: variable 'ansible_search_path' from source: unknown 49915 1727204323.79317: calling self._execute() 49915 1727204323.79480: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204323.79483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204323.79495: variable 'omit' from source: magic vars 49915 1727204323.79888: variable 'ansible_distribution_major_version' from source: facts 49915 1727204323.79917: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204323.80123: variable 'type' from source: play vars 49915 1727204323.80135: variable 'state' from source: include params 49915 1727204323.80143: variable 'interface' from source: play vars 49915 1727204323.80150: variable 'current_interfaces' from source: set_fact 49915 1727204323.80239: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 49915 1727204323.80242: when evaluation is False, skipping this task 49915 1727204323.80244: _execute() done 49915 1727204323.80246: dumping result to json 49915 1727204323.80248: done dumping result, returning 49915 1727204323.80250: done running TaskExecutor() for managed-node2/TASK: Create tap interface lsr101 [028d2410-947f-dcd7-b5af-000000000945] 49915 1727204323.80252: sending task result for task 028d2410-947f-dcd7-b5af-000000000945 49915 1727204323.80315: done sending task result for task 028d2410-947f-dcd7-b5af-000000000945 49915 1727204323.80318: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 49915 1727204323.80391: no more pending results, returning what we have 49915 1727204323.80395: results queue empty 49915 1727204323.80396: checking for any_errors_fatal 49915 1727204323.80406: done checking for any_errors_fatal 49915 1727204323.80407: checking for max_fail_percentage 49915 1727204323.80409: done checking for max_fail_percentage 49915 1727204323.80410: checking to see if all hosts have failed and the running result is not ok 49915 1727204323.80411: done checking to see if all hosts have failed 49915 1727204323.80412: getting the remaining hosts for this loop 49915 1727204323.80413: done getting the remaining hosts for this loop 49915 1727204323.80417: getting the next task for host managed-node2 49915 1727204323.80425: done getting next task for host managed-node2 49915 1727204323.80428: ^ task is: TASK: Delete tap interface {{ interface }} 49915 1727204323.80431: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204323.80435: getting variables 49915 1727204323.80437: in VariableManager get_vars() 49915 1727204323.80486: Calling all_inventory to load vars for managed-node2 49915 1727204323.80489: Calling groups_inventory to load vars for managed-node2 49915 1727204323.80491: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204323.80505: Calling all_plugins_play to load vars for managed-node2 49915 1727204323.80507: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204323.80510: Calling groups_plugins_play to load vars for managed-node2 49915 1727204323.83299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204323.84829: done with get_vars() 49915 1727204323.84854: done getting variables 49915 1727204323.84914: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 49915 1727204323.85043: variable 'interface' from source: play vars TASK [Delete tap interface lsr101] ********************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 14:58:43 -0400 (0:00:00.066) 0:00:30.556 ***** 49915 1727204323.85073: entering _queue_task() for managed-node2/command 49915 1727204323.85658: worker is 1 (out of 1 available) 49915 1727204323.85672: exiting _queue_task() for managed-node2/command 49915 1727204323.85692: done queuing things up, now waiting for results queue to drain 49915 1727204323.85694: waiting for pending results... 
49915 1727204323.86208: running TaskExecutor() for managed-node2/TASK: Delete tap interface lsr101 49915 1727204323.86347: in run() - task 028d2410-947f-dcd7-b5af-000000000946 49915 1727204323.86408: variable 'ansible_search_path' from source: unknown 49915 1727204323.86412: variable 'ansible_search_path' from source: unknown 49915 1727204323.86415: calling self._execute() 49915 1727204323.86513: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204323.86565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204323.86569: variable 'omit' from source: magic vars 49915 1727204323.86937: variable 'ansible_distribution_major_version' from source: facts 49915 1727204323.86958: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204323.87174: variable 'type' from source: play vars 49915 1727204323.87187: variable 'state' from source: include params 49915 1727204323.87196: variable 'interface' from source: play vars 49915 1727204323.87214: variable 'current_interfaces' from source: set_fact 49915 1727204323.87220: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 49915 1727204323.87280: when evaluation is False, skipping this task 49915 1727204323.87283: _execute() done 49915 1727204323.87286: dumping result to json 49915 1727204323.87288: done dumping result, returning 49915 1727204323.87290: done running TaskExecutor() for managed-node2/TASK: Delete tap interface lsr101 [028d2410-947f-dcd7-b5af-000000000946] 49915 1727204323.87292: sending task result for task 028d2410-947f-dcd7-b5af-000000000946 49915 1727204323.87362: done sending task result for task 028d2410-947f-dcd7-b5af-000000000946 49915 1727204323.87365: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 49915 1727204323.87548: no more pending results, returning what we have 49915 1727204323.87552: results queue empty 49915 1727204323.87553: checking for any_errors_fatal 49915 1727204323.87559: done checking for any_errors_fatal 49915 1727204323.87560: checking for max_fail_percentage 49915 1727204323.87562: done checking for max_fail_percentage 49915 1727204323.87563: checking to see if all hosts have failed and the running result is not ok 49915 1727204323.87564: done checking to see if all hosts have failed 49915 1727204323.87565: getting the remaining hosts for this loop 49915 1727204323.87566: done getting the remaining hosts for this loop 49915 1727204323.87570: getting the next task for host managed-node2 49915 1727204323.87583: done getting next task for host managed-node2 49915 1727204323.87587: ^ task is: TASK: Verify network state restored to default 49915 1727204323.87590: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204323.87594: getting variables 49915 1727204323.87596: in VariableManager get_vars() 49915 1727204323.87756: Calling all_inventory to load vars for managed-node2 49915 1727204323.87760: Calling groups_inventory to load vars for managed-node2 49915 1727204323.87763: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204323.87773: Calling all_plugins_play to load vars for managed-node2 49915 1727204323.87778: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204323.87781: Calling groups_plugins_play to load vars for managed-node2 49915 1727204323.89253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204323.90990: done with get_vars() 49915 1727204323.91016: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:77 Tuesday 24 September 2024 14:58:43 -0400 (0:00:00.060) 0:00:30.617 ***** 49915 1727204323.91112: entering _queue_task() for managed-node2/include_tasks 49915 1727204323.91563: worker is 1 (out of 1 available) 49915 1727204323.91574: exiting _queue_task() for managed-node2/include_tasks 49915 1727204323.91588: done queuing things up, now waiting for results queue to drain 49915 1727204323.91589: waiting for pending results... 49915 1727204323.91823: running TaskExecutor() for managed-node2/TASK: Verify network state restored to default 49915 1727204323.91858: in run() - task 028d2410-947f-dcd7-b5af-0000000000ab 49915 1727204323.91920: variable 'ansible_search_path' from source: unknown 49915 1727204323.91924: calling self._execute() 49915 1727204323.92014: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204323.92030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204323.92044: variable 'omit' from source: magic vars 49915 1727204323.92448: variable 'ansible_distribution_major_version' from source: facts 49915 1727204323.92471: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204323.92534: _execute() done 49915 1727204323.92538: dumping result to json 49915 1727204323.92541: done dumping result, returning 49915 1727204323.92543: done running TaskExecutor() for managed-node2/TASK: Verify network state restored to default [028d2410-947f-dcd7-b5af-0000000000ab] 49915 1727204323.92545: sending task result for task 028d2410-947f-dcd7-b5af-0000000000ab 49915 1727204323.92687: no more pending results, returning what we have 49915 1727204323.92693: in VariableManager get_vars() 49915 1727204323.92749: Calling all_inventory to load vars for managed-node2 49915 1727204323.92753: Calling groups_inventory to load vars for managed-node2 49915 1727204323.92755: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204323.92769: Calling all_plugins_play to load vars for managed-node2 49915 1727204323.92772: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204323.92778: Calling groups_plugins_play to load vars for managed-node2 49915 1727204323.93491: done sending task result for task 028d2410-947f-dcd7-b5af-0000000000ab 49915 1727204323.93494: WORKER PROCESS EXITING 49915 1727204323.94340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204323.95915: done with get_vars() 49915 1727204323.95935: 
variable 'ansible_search_path' from source: unknown 49915 1727204323.95950: we have included files to process 49915 1727204323.95951: generating all_blocks data 49915 1727204323.95953: done generating all_blocks data 49915 1727204323.95958: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 49915 1727204323.95959: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 49915 1727204323.95962: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 49915 1727204323.96382: done processing included file 49915 1727204323.96384: iterating over new_blocks loaded from include file 49915 1727204323.96386: in VariableManager get_vars() 49915 1727204323.96404: done with get_vars() 49915 1727204323.96405: filtering new block on tags 49915 1727204323.96423: done filtering new block on tags 49915 1727204323.96425: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node2 49915 1727204323.96431: extending task lists for all hosts with included blocks 49915 1727204323.99845: done extending task lists 49915 1727204323.99846: done processing included files 49915 1727204323.99847: results queue empty 49915 1727204323.99848: checking for any_errors_fatal 49915 1727204323.99852: done checking for any_errors_fatal 49915 1727204323.99853: checking for max_fail_percentage 49915 1727204323.99854: done checking for max_fail_percentage 49915 1727204323.99855: checking to see if all hosts have failed and the running result is not ok 49915 1727204323.99856: done checking to see if all hosts have failed 49915 1727204323.99857: getting the remaining hosts for this loop 49915 1727204323.99858: done getting the remaining hosts for this loop 49915 1727204323.99860: getting the next task for host managed-node2 49915 1727204323.99864: done getting next task for host managed-node2 49915 1727204323.99867: ^ task is: TASK: Check routes and DNS 49915 1727204323.99869: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 49915 1727204323.99872: getting variables 49915 1727204323.99872: in VariableManager get_vars() 49915 1727204323.99890: Calling all_inventory to load vars for managed-node2 49915 1727204323.99893: Calling groups_inventory to load vars for managed-node2 49915 1727204323.99895: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204323.99902: Calling all_plugins_play to load vars for managed-node2 49915 1727204323.99904: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204323.99908: Calling groups_plugins_play to load vars for managed-node2 49915 1727204324.01014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204324.01855: done with get_vars() 49915 1727204324.01870: done getting variables 49915 1727204324.01904: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:58:44 -0400 (0:00:00.108) 0:00:30.725 ***** 49915 1727204324.01926: entering _queue_task() for managed-node2/shell 49915 1727204324.02186: worker is 1 (out of 1 available) 49915 1727204324.02199: exiting _queue_task() for managed-node2/shell 49915 1727204324.02212: done queuing things up, now waiting for results queue to drain 49915 1727204324.02214: waiting for pending results... 49915 1727204324.02394: running TaskExecutor() for managed-node2/TASK: Check routes and DNS 49915 1727204324.02479: in run() - task 028d2410-947f-dcd7-b5af-000000000b17 49915 1727204324.02496: variable 'ansible_search_path' from source: unknown 49915 1727204324.02500: variable 'ansible_search_path' from source: unknown 49915 1727204324.02586: calling self._execute() 49915 1727204324.02653: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204324.02666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204324.02682: variable 'omit' from source: magic vars 49915 1727204324.03092: variable 'ansible_distribution_major_version' from source: facts 49915 1727204324.03134: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204324.03137: variable 'omit' from source: magic vars 49915 1727204324.03381: variable 'omit' from source: magic vars 49915 1727204324.03384: variable 'omit' from source: magic vars 49915 1727204324.03387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204324.03390: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204324.03393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204324.03395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204324.03398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204324.03400: variable 'inventory_hostname' from source: host vars for 'managed-node2' 
49915 1727204324.03402: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204324.03404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204324.03494: Set connection var ansible_connection to ssh 49915 1727204324.03504: Set connection var ansible_shell_type to sh 49915 1727204324.03526: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204324.03552: Set connection var ansible_shell_executable to /bin/sh 49915 1727204324.03595: Set connection var ansible_timeout to 10 49915 1727204324.03597: Set connection var ansible_pipelining to False 49915 1727204324.03600: variable 'ansible_shell_executable' from source: unknown 49915 1727204324.03602: variable 'ansible_connection' from source: unknown 49915 1727204324.03605: variable 'ansible_module_compression' from source: unknown 49915 1727204324.03607: variable 'ansible_shell_type' from source: unknown 49915 1727204324.03610: variable 'ansible_shell_executable' from source: unknown 49915 1727204324.03614: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204324.03617: variable 'ansible_pipelining' from source: unknown 49915 1727204324.03619: variable 'ansible_timeout' from source: unknown 49915 1727204324.03622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204324.03728: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204324.03742: variable 'omit' from source: magic vars 49915 1727204324.03745: starting attempt loop 49915 1727204324.03747: running the handler 49915 1727204324.03753: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204324.03770: _low_level_execute_command(): starting 49915 1727204324.03778: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204324.04424: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204324.04441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204324.04460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 49915 1727204324.04568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204324.06279: stdout chunk (state=3): >>>/root <<< 49915 1727204324.06408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204324.06421: stdout chunk (state=3): >>><<< 49915 1727204324.06439: stderr chunk (state=3): >>><<< 49915 1727204324.06464: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204324.06495: _low_level_execute_command(): starting 49915 1727204324.06506: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204324.064803-51945-144955469921306 `" && echo ansible-tmp-1727204324.064803-51945-144955469921306="` echo /root/.ansible/tmp/ansible-tmp-1727204324.064803-51945-144955469921306 `" ) && sleep 0' 49915 1727204324.07090: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204324.07095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration <<< 49915 1727204324.07106: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204324.07109: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204324.07125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204324.07211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
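The mkdir one-liner just issued above (creating ansible-tmp-1727204324.064803-51945-144955469921306) starts the same per-task remote lifecycle already seen for the veth deletion earlier in this log: make a private temp directory, SFTP the AnsiballZ_command.py payload into it, mark it executable, run it with the remote interpreter, then delete the directory. A compact summary of that sequence, parameterized by the per-task temp directory; illustrative only, with simplified quoting compared with the backtick-echo form the log actually shows:

def remote_steps(tmpdir: str, interpreter: str = "/usr/bin/python3.12") -> list:
    # Summary of the commands visible in this log for one command/shell task;
    # not Ansible's code, and <timestamp>-<id> stands in for the generated name.
    return [
        f"/bin/sh -c '( umask 77 && mkdir -p /root/.ansible/tmp && mkdir {tmpdir} ) && sleep 0'",
        f"sftp put: <local ansible-local tmp file> -> {tmpdir}/AnsiballZ_command.py",
        f"/bin/sh -c 'chmod u+x {tmpdir}/ {tmpdir}/AnsiballZ_command.py && sleep 0'",
        f"/bin/sh -c '{interpreter} {tmpdir}/AnsiballZ_command.py && sleep 0'",
        f"/bin/sh -c 'rm -f -r {tmpdir}/ > /dev/null 2>&1 && sleep 0'",
    ]

for step in remote_steps("/root/.ansible/tmp/ansible-tmp-<timestamp>-<id>"):
    print(step)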
49915 1727204324.09112: stdout chunk (state=3): >>>ansible-tmp-1727204324.064803-51945-144955469921306=/root/.ansible/tmp/ansible-tmp-1727204324.064803-51945-144955469921306 <<< 49915 1727204324.09218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204324.09262: stderr chunk (state=3): >>><<< 49915 1727204324.09286: stdout chunk (state=3): >>><<< 49915 1727204324.09364: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204324.064803-51945-144955469921306=/root/.ansible/tmp/ansible-tmp-1727204324.064803-51945-144955469921306 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204324.09367: variable 'ansible_module_compression' from source: unknown 49915 1727204324.09421: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49915 1727204324.09568: variable 'ansible_facts' from source: unknown 49915 1727204324.09669: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204324.064803-51945-144955469921306/AnsiballZ_command.py 49915 1727204324.09821: Sending initial data 49915 1727204324.10081: Sent initial data (155 bytes) 49915 1727204324.10894: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204324.11193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204324.11263: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 49915 1727204324.13146: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." <<< 49915 1727204324.13203: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp3rk4lgor /root/.ansible/tmp/ansible-tmp-1727204324.064803-51945-144955469921306/AnsiballZ_command.py <<< 49915 1727204324.13212: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204324.064803-51945-144955469921306/AnsiballZ_command.py" <<< 49915 1727204324.13248: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmp3rk4lgor" to remote "/root/.ansible/tmp/ansible-tmp-1727204324.064803-51945-144955469921306/AnsiballZ_command.py" <<< 49915 1727204324.13342: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204324.064803-51945-144955469921306/AnsiballZ_command.py" <<< 49915 1727204324.14528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204324.14531: stdout chunk (state=3): >>><<< 49915 1727204324.14538: stderr chunk (state=3): >>><<< 49915 1727204324.14579: done transferring module to remote 49915 1727204324.14591: _low_level_execute_command(): starting 49915 1727204324.14594: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204324.064803-51945-144955469921306/ /root/.ansible/tmp/ansible-tmp-1727204324.064803-51945-144955469921306/AnsiballZ_command.py && sleep 0' 49915 1727204324.15237: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204324.15246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204324.15257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204324.15270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204324.15285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204324.15350: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204324.15435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204324.15447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204324.15450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204324.15510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204324.17341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204324.17345: stdout chunk (state=3): >>><<< 49915 1727204324.17382: stderr chunk (state=3): >>><<< 49915 1727204324.17386: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204324.17388: _low_level_execute_command(): starting 49915 1727204324.17390: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204324.064803-51945-144955469921306/AnsiballZ_command.py && sleep 0' 49915 1727204324.18182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204324.18281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204324.18284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204324.18286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204324.18288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204324.18290: stderr chunk (state=3): >>>debug2: match not found <<< 49915 1727204324.18292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204324.18294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 49915 1727204324.18383: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204324.18388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204324.18495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204324.34542: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:e4:80:fb:2d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.13.254/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2985sec preferred_lft 2985sec\n inet6 fe80::8ff:e4ff:fe80:fb2d/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.254 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.13.254 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:58:44.335115", "end": "2024-09-24 14:58:44.343893", "delta": "0:00:00.008778", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49915 1727204324.36038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 49915 1727204324.36062: stderr chunk (state=3): >>><<< 49915 1727204324.36067: stdout chunk (state=3): >>><<< 49915 1727204324.36090: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:e4:80:fb:2d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.13.254/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2985sec preferred_lft 2985sec\n inet6 fe80::8ff:e4ff:fe80:fb2d/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.254 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.13.254 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:58:44.335115", "end": "2024-09-24 14:58:44.343893", "delta": "0:00:00.008778", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
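[Editor's note] The module result above belongs to the "Check routes and DNS" task, which runs a short shell script on the managed node. For readers who want to reproduce the same diagnostic by hand outside Ansible, the following is a minimal standalone sketch assembled from the "cmd"/"_raw_params" field recorded in the result above; the shebang and indentation are additions here (Ansible invokes the body via /bin/sh -c on the remote host), everything else is as logged.

    #!/bin/bash
    # Standalone copy of the "Check routes and DNS" command body from the task result.
    # pipefail is not guaranteed in a strict POSIX sh, hence bash here.
    set -euo pipefail
    echo IP
    ip a                     # interface addresses
    echo IP ROUTE
    ip route                 # IPv4 routing table
    echo IP -6 ROUTE
    ip -6 route              # IPv6 routing table
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf # resolver configuration as generated by NetworkManager
    else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :
    fi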
49915 1727204324.36125: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204324.064803-51945-144955469921306/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204324.36132: _low_level_execute_command(): starting 49915 1727204324.36137: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204324.064803-51945-144955469921306/ > /dev/null 2>&1 && sleep 0' 49915 1727204324.36562: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204324.36565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204324.36581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found <<< 49915 1727204324.36598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204324.36600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204324.36659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204324.36663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204324.36667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204324.36738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204324.38556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204324.38584: stderr chunk (state=3): >>><<< 49915 1727204324.38587: stdout chunk (state=3): >>><<< 49915 1727204324.38600: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204324.38606: handler run complete 49915 1727204324.38625: Evaluated conditional (False): False 49915 1727204324.38633: attempt loop complete, returning result 49915 1727204324.38636: _execute() done 49915 1727204324.38638: dumping result to json 49915 1727204324.38644: done dumping result, returning 49915 1727204324.38652: done running TaskExecutor() for managed-node2/TASK: Check routes and DNS [028d2410-947f-dcd7-b5af-000000000b17] 49915 1727204324.38656: sending task result for task 028d2410-947f-dcd7-b5af-000000000b17 49915 1727204324.38758: done sending task result for task 028d2410-947f-dcd7-b5af-000000000b17 49915 1727204324.38760: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008778", "end": "2024-09-24 14:58:44.343893", "rc": 0, "start": "2024-09-24 14:58:44.335115" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:e4:80:fb:2d brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.13.254/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 2985sec preferred_lft 2985sec inet6 fe80::8ff:e4ff:fe80:fb2d/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.254 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.13.254 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 49915 1727204324.38846: no more pending results, returning what we have 49915 1727204324.38850: results queue empty 49915 1727204324.38851: checking for any_errors_fatal 49915 1727204324.38852: done checking for any_errors_fatal 49915 1727204324.38853: checking for max_fail_percentage 49915 1727204324.38855: done checking for max_fail_percentage 49915 1727204324.38856: checking to see if all hosts have failed and the running result is not ok 49915 1727204324.38857: done checking to see if all hosts have failed 49915 1727204324.38857: getting the remaining hosts for this loop 49915 1727204324.38859: done getting the remaining hosts for this loop 49915 1727204324.38862: 
getting the next task for host managed-node2 49915 1727204324.38868: done getting next task for host managed-node2 49915 1727204324.38870: ^ task is: TASK: Verify DNS and network connectivity 49915 1727204324.38873: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 49915 1727204324.38880: getting variables 49915 1727204324.38881: in VariableManager get_vars() 49915 1727204324.38929: Calling all_inventory to load vars for managed-node2 49915 1727204324.38931: Calling groups_inventory to load vars for managed-node2 49915 1727204324.38933: Calling all_plugins_inventory to load vars for managed-node2 49915 1727204324.38943: Calling all_plugins_play to load vars for managed-node2 49915 1727204324.38946: Calling groups_plugins_inventory to load vars for managed-node2 49915 1727204324.38948: Calling groups_plugins_play to load vars for managed-node2 49915 1727204324.39739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 49915 1727204324.40601: done with get_vars() 49915 1727204324.40621: done getting variables 49915 1727204324.40664: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:58:44 -0400 (0:00:00.387) 0:00:31.113 ***** 49915 1727204324.40687: entering _queue_task() for managed-node2/shell 49915 1727204324.40931: worker is 1 (out of 1 available) 49915 1727204324.40944: exiting _queue_task() for managed-node2/shell 49915 1727204324.40956: done queuing things up, now waiting for results queue to drain 49915 1727204324.40957: waiting for pending results... 
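[Editor's note] The next task, "Verify DNS and network connectivity" (check_network_dns.yml:24), is executed through the same low-level sequence the log has been tracing: create a remote temp directory, transfer AnsiballZ_command.py over SFTP, chmod it, run it with the remote /usr/bin/python3.12, then remove the temp directory. The script it ultimately runs is shown in the result further down; as a convenience, here is a hedged standalone sketch of that connectivity check, reconstructed from the "_raw_params" recorded later in the log (the shebang and comments are additions, not part of the role's task file).

    #!/bin/bash
    # Equivalent of the "Verify DNS and network connectivity" task body seen below.
    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
        # getent exercises the resolver configured in /etc/resolv.conf
        if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            exit 1
        fi
        # curl confirms the host is actually reachable over HTTPS
        if ! curl -o /dev/null "https://$host"; then
            echo FAILED to contact host "$host"
            exit 1
        fi
    done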
49915 1727204324.41135: running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity 49915 1727204324.41199: in run() - task 028d2410-947f-dcd7-b5af-000000000b18 49915 1727204324.41211: variable 'ansible_search_path' from source: unknown 49915 1727204324.41217: variable 'ansible_search_path' from source: unknown 49915 1727204324.41243: calling self._execute() 49915 1727204324.41316: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204324.41320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204324.41326: variable 'omit' from source: magic vars 49915 1727204324.41594: variable 'ansible_distribution_major_version' from source: facts 49915 1727204324.41602: Evaluated conditional (ansible_distribution_major_version != '6'): True 49915 1727204324.41698: variable 'ansible_facts' from source: unknown 49915 1727204324.42258: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 49915 1727204324.42262: variable 'omit' from source: magic vars 49915 1727204324.42293: variable 'omit' from source: magic vars 49915 1727204324.42318: variable 'omit' from source: magic vars 49915 1727204324.42348: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 49915 1727204324.42377: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 49915 1727204324.42395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 49915 1727204324.42408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204324.42419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 49915 1727204324.42442: variable 'inventory_hostname' from source: host vars for 'managed-node2' 49915 1727204324.42445: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204324.42447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204324.42517: Set connection var ansible_connection to ssh 49915 1727204324.42521: Set connection var ansible_shell_type to sh 49915 1727204324.42523: Set connection var ansible_module_compression to ZIP_DEFLATED 49915 1727204324.42532: Set connection var ansible_shell_executable to /bin/sh 49915 1727204324.42536: Set connection var ansible_timeout to 10 49915 1727204324.42543: Set connection var ansible_pipelining to False 49915 1727204324.42561: variable 'ansible_shell_executable' from source: unknown 49915 1727204324.42563: variable 'ansible_connection' from source: unknown 49915 1727204324.42566: variable 'ansible_module_compression' from source: unknown 49915 1727204324.42568: variable 'ansible_shell_type' from source: unknown 49915 1727204324.42570: variable 'ansible_shell_executable' from source: unknown 49915 1727204324.42573: variable 'ansible_host' from source: host vars for 'managed-node2' 49915 1727204324.42578: variable 'ansible_pipelining' from source: unknown 49915 1727204324.42587: variable 'ansible_timeout' from source: unknown 49915 1727204324.42590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 49915 1727204324.42707: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204324.42718: variable 'omit' from source: magic vars 49915 1727204324.42721: starting attempt loop 49915 1727204324.42723: running the handler 49915 1727204324.42733: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 49915 1727204324.42750: _low_level_execute_command(): starting 49915 1727204324.42757: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 49915 1727204324.43268: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204324.43272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204324.43277: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204324.43280: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204324.43333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204324.43336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204324.43341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204324.43419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204324.45175: stdout chunk (state=3): >>>/root <<< 49915 1727204324.45268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204324.45299: stderr chunk (state=3): >>><<< 49915 1727204324.45302: stdout chunk (state=3): >>><<< 49915 1727204324.45322: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 
10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204324.45338: _low_level_execute_command(): starting 49915 1727204324.45341: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204324.4532208-51967-80025845286640 `" && echo ansible-tmp-1727204324.4532208-51967-80025845286640="` echo /root/.ansible/tmp/ansible-tmp-1727204324.4532208-51967-80025845286640 `" ) && sleep 0' 49915 1727204324.45796: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204324.45837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204324.45865: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204324.45868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204324.45871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204324.45903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204324.45986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204324.47938: stdout chunk (state=3): >>>ansible-tmp-1727204324.4532208-51967-80025845286640=/root/.ansible/tmp/ansible-tmp-1727204324.4532208-51967-80025845286640 <<< 49915 1727204324.48099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204324.48102: stdout chunk (state=3): >>><<< 49915 1727204324.48105: stderr chunk (state=3): >>><<< 49915 1727204324.48281: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204324.4532208-51967-80025845286640=/root/.ansible/tmp/ansible-tmp-1727204324.4532208-51967-80025845286640 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204324.48284: variable 'ansible_module_compression' from source: unknown 49915 1727204324.48287: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-49915ogiz3nec/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 49915 1727204324.48289: variable 'ansible_facts' from source: unknown 49915 1727204324.48355: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204324.4532208-51967-80025845286640/AnsiballZ_command.py 49915 1727204324.48559: Sending initial data 49915 1727204324.48569: Sent initial data (155 bytes) 49915 1727204324.48980: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204324.48995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204324.49007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204324.49049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204324.49072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204324.49141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204324.50941: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 49915 1727204324.50953: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 49915 1727204324.51025: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpxvh0kpzo /root/.ansible/tmp/ansible-tmp-1727204324.4532208-51967-80025845286640/AnsiballZ_command.py <<< 49915 1727204324.51028: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204324.4532208-51967-80025845286640/AnsiballZ_command.py" <<< 49915 1727204324.51085: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-49915ogiz3nec/tmpxvh0kpzo" to remote "/root/.ansible/tmp/ansible-tmp-1727204324.4532208-51967-80025845286640/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204324.4532208-51967-80025845286640/AnsiballZ_command.py" <<< 49915 1727204324.51940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204324.51945: stdout chunk (state=3): >>><<< 49915 1727204324.51956: stderr chunk (state=3): >>><<< 49915 1727204324.52013: done transferring module to remote 49915 1727204324.52095: _low_level_execute_command(): starting 49915 1727204324.52098: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204324.4532208-51967-80025845286640/ /root/.ansible/tmp/ansible-tmp-1727204324.4532208-51967-80025845286640/AnsiballZ_command.py && sleep 0' 49915 1727204324.52621: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204324.52680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204324.52808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204324.52811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204324.52816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204324.52818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204324.52871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204324.54748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204324.54752: stdout chunk (state=3): >>><<< 49915 1727204324.54754: stderr chunk (state=3): >>><<< 49915 1727204324.54859: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204324.54865: _low_level_execute_command(): starting 49915 1727204324.54868: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204324.4532208-51967-80025845286640/AnsiballZ_command.py && sleep 0' 49915 1727204324.55468: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204324.55486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204324.55501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204324.55530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 49915 1727204324.55638: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 <<< 49915 1727204324.55644: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204324.55667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204324.55686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204324.55711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204324.55819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204325.01029: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org 
mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1459 0 --:--:-- --:--:-- --:--:-- 1466\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 4373 0 --:--:-- --:--:-- --:--:-- 4409", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:58:44.709056", "end": "2024-09-24 14:58:45.008206", "delta": "0:00:00.299150", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 49915 1727204325.02630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 
<<< 49915 1727204325.02634: stdout chunk (state=3): >>><<< 49915 1727204325.02636: stderr chunk (state=3): >>><<< 49915 1727204325.02659: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1459 0 --:--:-- --:--:-- --:--:-- 1466\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 4373 0 --:--:-- --:--:-- --:--:-- 4409", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:58:44.709056", "end": "2024-09-24 14:58:45.008206", "delta": "0:00:00.299150", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.254 closed. 49915 1727204325.02780: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204324.4532208-51967-80025845286640/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 49915 1727204325.02784: _low_level_execute_command(): starting 49915 1727204325.02787: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204324.4532208-51967-80025845286640/ > /dev/null 2>&1 && sleep 0' 49915 1727204325.03390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 49915 1727204325.03406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 49915 1727204325.03423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 49915 1727204325.03490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 49915 1727204325.03555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' <<< 49915 1727204325.03573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 49915 1727204325.03598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 49915 1727204325.03702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 49915 1727204325.05589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 49915 1727204325.05794: stdout chunk (state=3): >>><<< 49915 1727204325.05797: stderr chunk (state=3): >>><<< 49915 1727204325.05800: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.254 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.254 originally 10.31.13.254 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7e62c1f305' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 49915 1727204325.05802: handler run complete 49915 1727204325.05804: Evaluated conditional (False): False 49915 1727204325.05806: attempt loop complete, returning result 49915 1727204325.05808: _execute() done 49915 1727204325.05810: dumping result to json 49915 1727204325.05812: done dumping result, returning 49915 1727204325.05817: done running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity [028d2410-947f-dcd7-b5af-000000000b18] 49915 1727204325.05819: sending task result for task 028d2410-947f-dcd7-b5af-000000000b18 49915 1727204325.05891: done sending task result for task 028d2410-947f-dcd7-b5af-000000000b18 49915 1727204325.05893: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.299150", "end": "2024-09-24 14:58:45.008206", "rc": 0, "start": "2024-09-24 14:58:44.709056" } STDOUT: CHECK DNS AND CONNECTIVITY 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 1459 0 --:--:-- --:--:-- --:--:-- 1466 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 4373 0 --:--:-- --:--:-- --:--:-- 4409 49915 1727204325.05961: no more pending results, returning what we have 49915 1727204325.05964: results queue empty 49915 1727204325.05965: checking for any_errors_fatal 49915 1727204325.05984: done checking for 
49915 1727204325.05985: checking for max_fail_percentage
49915 1727204325.05987: done checking for max_fail_percentage
49915 1727204325.05988: checking to see if all hosts have failed and the running result is not ok
49915 1727204325.05989: done checking to see if all hosts have failed
49915 1727204325.05989: getting the remaining hosts for this loop
49915 1727204325.05991: done getting the remaining hosts for this loop
49915 1727204325.05999: getting the next task for host managed-node2
49915 1727204325.06008: done getting next task for host managed-node2
49915 1727204325.06009: ^ task is: TASK: meta (flush_handlers)
49915 1727204325.06012: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
49915 1727204325.06019: getting variables
49915 1727204325.06020: in VariableManager get_vars()
49915 1727204325.06062: Calling all_inventory to load vars for managed-node2
49915 1727204325.06065: Calling groups_inventory to load vars for managed-node2
49915 1727204325.06067: Calling all_plugins_inventory to load vars for managed-node2
49915 1727204325.06193: Calling all_plugins_play to load vars for managed-node2
49915 1727204325.06198: Calling groups_plugins_inventory to load vars for managed-node2
49915 1727204325.06202: Calling groups_plugins_play to load vars for managed-node2
49915 1727204325.07663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
49915 1727204325.09441: done with get_vars()
49915 1727204325.09466: done getting variables
49915 1727204325.09538: in VariableManager get_vars()
49915 1727204325.09554: Calling all_inventory to load vars for managed-node2
49915 1727204325.09556: Calling groups_inventory to load vars for managed-node2
49915 1727204325.09558: Calling all_plugins_inventory to load vars for managed-node2
49915 1727204325.09563: Calling all_plugins_play to load vars for managed-node2
49915 1727204325.09565: Calling groups_plugins_inventory to load vars for managed-node2
49915 1727204325.09567: Calling groups_plugins_play to load vars for managed-node2
49915 1727204325.10692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
49915 1727204325.12360: done with get_vars()
49915 1727204325.12389: done queuing things up, now waiting for results queue to drain
49915 1727204325.12391: results queue empty
49915 1727204325.12391: checking for any_errors_fatal
49915 1727204325.12395: done checking for any_errors_fatal
49915 1727204325.12396: checking for max_fail_percentage
49915 1727204325.12397: done checking for max_fail_percentage
49915 1727204325.12397: checking to see if all hosts have failed and the running result is not ok
49915 1727204325.12398: done checking to see if all hosts have failed
49915 1727204325.12399: getting the remaining hosts for this loop
49915 1727204325.12400: done getting the remaining hosts for this loop
49915 1727204325.12402: getting the next task for host managed-node2
49915 1727204325.12406: done getting next task for host managed-node2
49915 1727204325.12407: ^ task is: TASK: meta (flush_handlers)
49915 1727204325.12408: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
49915 1727204325.12410: getting variables
49915 1727204325.12411: in VariableManager get_vars()
49915 1727204325.12424: Calling all_inventory to load vars for managed-node2
49915 1727204325.12426: Calling groups_inventory to load vars for managed-node2
49915 1727204325.12427: Calling all_plugins_inventory to load vars for managed-node2
49915 1727204325.12432: Calling all_plugins_play to load vars for managed-node2
49915 1727204325.12434: Calling groups_plugins_inventory to load vars for managed-node2
49915 1727204325.12436: Calling groups_plugins_play to load vars for managed-node2
49915 1727204325.13818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
49915 1727204325.15728: done with get_vars()
49915 1727204325.15754: done getting variables
49915 1727204325.15806: in VariableManager get_vars()
49915 1727204325.15824: Calling all_inventory to load vars for managed-node2
49915 1727204325.15827: Calling groups_inventory to load vars for managed-node2
49915 1727204325.15829: Calling all_plugins_inventory to load vars for managed-node2
49915 1727204325.15834: Calling all_plugins_play to load vars for managed-node2
49915 1727204325.15836: Calling groups_plugins_inventory to load vars for managed-node2
49915 1727204325.15839: Calling groups_plugins_play to load vars for managed-node2
49915 1727204325.17117: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
49915 1727204325.18824: done with get_vars()
49915 1727204325.18855: done queuing things up, now waiting for results queue to drain
49915 1727204325.18858: results queue empty
49915 1727204325.18859: checking for any_errors_fatal
49915 1727204325.18860: done checking for any_errors_fatal
49915 1727204325.18861: checking for max_fail_percentage
49915 1727204325.18862: done checking for max_fail_percentage
49915 1727204325.18862: checking to see if all hosts have failed and the running result is not ok
49915 1727204325.18863: done checking to see if all hosts have failed
49915 1727204325.18864: getting the remaining hosts for this loop
49915 1727204325.18865: done getting the remaining hosts for this loop
49915 1727204325.18868: getting the next task for host managed-node2
49915 1727204325.18871: done getting next task for host managed-node2
49915 1727204325.18872: ^ task is: None
49915 1727204325.18874: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
49915 1727204325.18877: done queuing things up, now waiting for results queue to drain
49915 1727204325.18878: results queue empty
49915 1727204325.18879: checking for any_errors_fatal
49915 1727204325.18880: done checking for any_errors_fatal
49915 1727204325.18880: checking for max_fail_percentage
49915 1727204325.18882: done checking for max_fail_percentage
49915 1727204325.18882: checking to see if all hosts have failed and the running result is not ok
49915 1727204325.18883: done checking to see if all hosts have failed
49915 1727204325.18885: getting the next task for host managed-node2
49915 1727204325.18888: done getting next task for host managed-node2
49915 1727204325.18889: ^ task is: None
49915 1727204325.18890: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node2              : ok=79   changed=2    unreachable=0    failed=0    skipped=67   rescued=0    ignored=0

Tuesday 24 September 2024  14:58:45 -0400 (0:00:00.782)       0:00:31.895 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 1.88s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.86s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:6
fedora.linux_system_roles.network : Check which services are running ---- 1.84s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.35s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 1.35s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Create veth interface lsr101 -------------------------------------------- 1.13s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Gathering Facts --------------------------------------------------------- 1.02s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.01s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Install iproute --------------------------------------------------------- 0.99s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Gather the minimum subset of ansible_facts required by the network role test --- 0.86s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.79s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Verify DNS and network connectivity ------------------------------------- 0.78s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Check which packages are installed --- 0.78s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Install iproute --------------------------------------------------------- 0.77s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.77s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.70s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.61s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Get stat for interface lsr101.90 ---------------------------------------- 0.49s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Gather current interface info ------------------------------------------- 0.49s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Stat profile file ------------------------------------------------------- 0.46s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
49915 1727204325.19057: RUNNING CLEANUP
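The per-task timing table after the recap is the kind of output a task-profiling callback produces. How this particular run enabled it is not visible in the log; purely as an illustration, a profiling callback such as ansible.posix.profile_tasks can be switched on for a single run as sketched below (inventory.yml and site.yml are placeholder paths, not files from this run).

# Hypothetical invocation: enable the ansible.posix.profile_tasks callback
# so each task's wall-clock time is summarised after the play recap.
# ANSIBLE_CALLBACKS_ENABLED is the environment form of the
# callbacks_enabled setting in ansible.cfg.
ANSIBLE_CALLBACKS_ENABLED=ansible.posix.profile_tasks \
    ansible-playbook -i inventory.yml site.yml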